Team details

Team number: 22

First member:
Full name: Ψαρουδάκης Ανδρέας
Registration number: 03116001
Email: andreaspsaroudakis@gmail.com

Second member:
Full name: Τζε Χριστίνα-Ουρανία
Registration number: 03116079
Email: xristina.rania.tze@gmail.com

Introduction

The goal of this lab exercise is to optimize the performance of deep learning models on the CIFAR-100 dataset using the TensorFlow 2 library. We optimize both models built from scratch and transfer learning networks. We start by evaluating the models on a subset of 20 classes and then gradually increase the number of classes up to 80. We also study separately the effect of the batch size and of the optimizer on the performance of our optimized models, and we record the training times for the 80-class problem.

Deep learning on CIFAR-100

Introduction and overview of the dataset

In [ ]:
from __future__ import absolute_import, division, print_function, unicode_literals # legacy compatibility

import tensorflow as tf
from tensorflow.keras import datasets, layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.regularizers import l2  # use tf.keras, not standalone keras, to avoid mixing the two APIs

import time
import numpy as np
import pandas as pd
import datetime
import matplotlib.pyplot as plt
In [ ]:
# helper functions

# select the elements of from_list whose indices appear in index_list
def select_from_list(from_list, index_list):
  filtered_list = [from_list[i] for i in index_list]
  return filtered_list

# collect in filtered_list the index of each element of unfiltered_list that appears in target_list
def get_ds_index(unfiltered_list, target_list):
  filtered_list = []
  for index, label in enumerate(unfiltered_list):
    if label[0] in target_list:
      filtered_list.append(index)
  return filtered_list

# select a url for a unique subset of CIFAR-100 with 20, 40, 60, or 80 classes
def select_classes_number(classes_number = 20):
  cifar100_20_classes_url = "https://pastebin.com/raw/nzE1n98V"
  cifar100_40_classes_url = "https://pastebin.com/raw/zGX4mCNP"
  cifar100_60_classes_url = "https://pastebin.com/raw/nsDTd3Qn"
  cifar100_80_classes_url = "https://pastebin.com/raw/SNbXz700"
  if classes_number == 20:
    return cifar100_20_classes_url
  elif classes_number == 40:
    return cifar100_40_classes_url
  elif classes_number == 60:
    return cifar100_60_classes_url
  elif classes_number == 80:
    return cifar100_80_classes_url
  else:
    return -1
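As a quick illustration of the two filtering helpers above, here is a self-contained toy run (the label names and values below are made up for the example, not our actual team subset):

```python
# self-contained restatement of the helpers above, exercised on toy data
def select_from_list(from_list, index_list):
  return [from_list[i] for i in index_list]

def get_ds_index(unfiltered_list, target_list):
  # each element is a 1-element array/list holding a fine label, CIFAR-100 style
  return [idx for idx, row in enumerate(unfiltered_list) if row[0] in target_list]

labels = ['apple', 'bee', 'crab', 'dolphin']   # hypothetical label names
print(select_from_list(labels, [1, 3]))        # ['bee', 'dolphin']

y = [[6], [1], [9], [6]]                       # hypothetical per-image fine labels
print(get_ds_index(y, [6, 9]))                 # [0, 2, 3]
```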
In [ ]:
# load the entire dataset
(x_train_all, y_train_all), (x_test_all, y_test_all) = tf.keras.datasets.cifar100.load_data(label_mode='fine')
Downloading data from https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz
169009152/169001437 [==============================] - 4s 0us/step
In [ ]:
print(x_train_all.shape)
(50000, 32, 32, 3)

Each team will work with its own unique subset of CIFAR-100. In the next cell, replace the value of the team_seed variable with your team number.

In [ ]:
# REPLACE WITH YOUR TEAM NUMBER
team_seed = 22

In the next cell you can select the number of categories: 20 (default), 40, 60 or 80.

In [ ]:
# select the number of classes
cifar100_classes_url = select_classes_number()

We create our team's unique dataset:

In [ ]:
team_classes = pd.read_csv(cifar100_classes_url, sep=',', header=None)
CIFAR100_LABELS_LIST = pd.read_csv('https://pastebin.com/raw/qgDaNggt', sep=',', header=None).astype(str).values.tolist()[0]

our_index = team_classes.iloc[team_seed,:].values.tolist()
print(our_index)
our_classes = select_from_list(CIFAR100_LABELS_LIST, our_index)
train_index = get_ds_index(y_train_all, our_index)
test_index = get_ds_index(y_test_all, our_index)

x_train_ds = np.asarray(select_from_list(x_train_all, train_index))
y_train_ds = np.asarray(select_from_list(y_train_all, train_index))
x_test_ds = np.asarray(select_from_list(x_test_all, test_index))
y_test_ds = np.asarray(select_from_list(y_test_all, test_index))
[1, 6, 9, 19, 25, 26, 27, 29, 32, 33, 39, 42, 53, 68, 79, 86, 87, 88, 91, 98]
In [ ]:
# print our classes
print(our_classes)
[' aquarium_fish', ' bee', ' bottle', ' cattle', ' couch', ' crab', ' crocodile', ' dinosaur', ' flatfish', ' forest', ' keyboard', ' leopard', ' orange', ' road', ' spider', ' telephone', ' television', ' tiger', ' trout', ' woman']
In [ ]:
CLASSES_NUM=len(our_classes)
print(CLASSES_NUM)
20
In [ ]:
# get (train) dataset dimensions
data_size, img_rows, img_cols, img_channels = x_train_ds.shape

# set validation set percentage (wrt the training set size)
validation_percentage = 0.15
val_size = round(validation_percentage * data_size)

# Reserve val_size samples for validation and normalize all values
x_val = x_train_ds[-val_size:]/255
y_val = y_train_ds[-val_size:]
x_train = x_train_ds[:-val_size]/255
y_train = y_train_ds[:-val_size]
x_test = x_test_ds/255
y_test = y_test_ds

print(len(x_val))

# summarize loaded dataset
print('Train: X=%s, y=%s' % (x_train.shape, y_train.shape))
print('Validation: X=%s, y=%s' % (x_val.shape, y_val.shape))
print('Test: X=%s, y=%s' % (x_test.shape, y_test.shape))

# get class label from class index
def class_label_from_index(fine_category):
  return(CIFAR100_LABELS_LIST[fine_category.item(0)])

# plot first few images
plt.figure(figsize=(6, 6))
for i in range(9):
	# define subplot
  plt.subplot(330 + 1 + i).set_title(class_label_from_index(y_train[i]))
	# plot raw pixel data
  plt.imshow(x_train[i], cmap=plt.get_cmap('gray'))
  #show the figure
plt.show()
1500
Train: X=(8500, 32, 32, 3), y=(8500, 1)
Validation: X=(1500, 32, 32, 3), y=(1500, 1)
Test: X=(2000, 32, 32, 3), y=(2000, 1)

Transforming the labels

Because the number of classes we work with (20, 40, 60 or 80) differs from the number of classes in CIFAR-100 (100), we first map our team's labels to new values in the interval [0, num_of_classes). To this end we define the functions create_dict and create_new_labels. The first takes as argument a sorted list, old_labels, containing the labels derived from our team seed, and returns a dictionary mapping each old label to a new one, from 0 up to num_of_classes - 1. The second creates and returns an array with the same dimensions as the original y_train array, containing the new label of every image according to the mapping defined by the dictionary from create_dict.

In [ ]:
def create_dict(old_labels):
  d = dict()
  for counter, label in enumerate(old_labels):
    d[label] = counter
  return d
In [ ]:
def create_new_labels(old_labels,y_train):
  d = create_dict(old_labels)
  new_labels = np.zeros((y_train.shape[0],y_train.shape[1])).astype(np.uint8)
  for i in range(y_train.shape[0]): # For every image replace the old label with the new one
    new_labels[i] = d[y_train[i][0]]
  return new_labels
In [ ]:
y_train = create_new_labels(our_index,y_train)
y_val = create_new_labels(our_index,y_val)
y_test = create_new_labels(our_index,y_test)
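The remapping can be sanity-checked on toy data (the label values below are made up for illustration; only the logic matches the functions above):

```python
import numpy as np

# self-contained sketch of the label remapping above, run on toy data
def create_dict(old_labels):
  return {label: new for new, label in enumerate(old_labels)}

def create_new_labels(old_labels, y):
  d = create_dict(old_labels)
  return np.array([[d[row[0]]] for row in y], dtype=np.uint8)

our_index = [1, 6, 9]                    # hypothetical sorted team labels
y = np.array([[9], [1], [6], [9]])       # original CIFAR-100 fine labels
print(create_new_labels(our_index, y).ravel().tolist())  # [2, 0, 1, 2]
```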

Training functions

We will use the data prefetch feature of TF2:

In [ ]:
# we use prefetch https://www.tensorflow.org/api_docs/python/tf/data/Dataset#prefetch
# see also AUTOTUNE
# the dataset is now "infinite" (it repeats indefinitely)

BATCH_SIZE = 32
IMG_SIZE = 224
AUTOTUNE = tf.data.experimental.AUTOTUNE # https://www.tensorflow.org/guide/data_performance

def _input_fn(x,y, BATCH_SIZE):
  ds = tf.data.Dataset.from_tensor_slices((x,y))
  ds = ds.shuffle(buffer_size=data_size)
  ds = ds.repeat()
  ds = ds.batch(BATCH_SIZE)
  ds = ds.prefetch(buffer_size=AUTOTUNE)
  return ds

def resize_transform(image,label):
  return tf.image.resize(image, (IMG_SIZE, IMG_SIZE)),label

train_ds =_input_fn(x_train,y_train, BATCH_SIZE) #PrefetchDataset object
validation_ds =_input_fn(x_val,y_val, BATCH_SIZE) #PrefetchDataset object
test_ds =_input_fn(x_test,y_test, BATCH_SIZE) #PrefetchDataset object

train_ds_res = train_ds.map(resize_transform)
validation_ds_res = validation_ds.map(resize_transform)
test_ds_res = test_ds.map(resize_transform)

# steps_per_epoch and validation_steps for training and validation: https://www.tensorflow.org/guide/keras/train_and_evaluate

def train_model(model, train_dataset = train_ds, validation_dataset = validation_ds, epochs = 100, callbacks = None,
                steps_per_epoch = int(np.ceil(x_train.shape[0]/BATCH_SIZE)),
                validation_steps = int(np.ceil(x_val.shape[0]/BATCH_SIZE))):
  history = model.fit(train_dataset, epochs=epochs, steps_per_epoch=steps_per_epoch,
                      validation_data=validation_dataset, validation_steps=validation_steps,
                      callbacks=callbacks)
  return history
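Because the prefetched datasets repeat indefinitely, Keras only knows where an epoch ends through steps_per_epoch and validation_steps. The defaults above are simply the number of batches needed to cover each split once; a minimal check of that arithmetic, using the split sizes printed earlier:

```python
import numpy as np

BATCH_SIZE = 32
train_size, val_size, test_size = 8500, 1500, 2000  # split sizes from the 20-class setup

steps_per_epoch  = int(np.ceil(train_size / BATCH_SIZE))  # batches per training epoch
validation_steps = int(np.ceil(val_size / BATCH_SIZE))    # batches per validation pass
evaluation_steps = int(np.ceil(test_size / BATCH_SIZE))   # batches per test evaluation

print(steps_per_epoch, validation_steps, evaluation_steps)  # 266 47 63
```

The 266/266 counter in the training logs corresponds exactly to this ceil(8500 / 32).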

Training curves and performance on the test set

In [ ]:
# plot diagnostic learning curves
def summarize_diagnostics(history):
  plt.figure(figsize=(8, 8))
  plt.suptitle('Training Curves')
  # plot loss
  plt.subplot(211)
  plt.title('Cross Entropy Loss')
  plt.plot(history.history['loss'], color='blue', label='train')
  plt.plot(history.history['val_loss'], color='orange', label='val')
  plt.legend(loc='upper right')
  # plot accuracy
  plt.subplot(212)
  plt.title('Classification Accuracy')
  plt.plot(history.history['accuracy'], color='blue', label='train')
  plt.plot(history.history['val_accuracy'], color='orange', label='val')
  plt.legend(loc='lower right')
  return plt

# print test set evaluation metrics
def model_evaluation(model, evaluation_dataset, evaluation_steps):
  print('\n\033[1mTest set evaluation metrics\033[0m')
  print('---------------------------')
  loss0, accuracy0 = model.evaluate(evaluation_dataset, steps=evaluation_steps, verbose=0)
  print("\033[1mLoss:     {:.3f}".format(loss0))
  print("\033[1mAccuracy: {:.3f}%".format(accuracy0*100))
  return (loss0, accuracy0)

def model_report(model, history, evaluation_dataset = test_ds,
                 evaluation_steps = int(np.ceil(x_test.shape[0]/BATCH_SIZE))):
  plt = summarize_diagnostics(history)
  plt.show()
  return model_evaluation(model, evaluation_dataset, evaluation_steps)

Network models

We start by defining some models "from scratch" and examine their performance on the 20-class classification problem.

"From scratch" networks

We define two dictionaries, losses and accuracies, whose keys are the names of the models we examine and whose values are the corresponding losses and accuracies.

In [ ]:
losses = {}
accuracies = {}

Simple CNN

The Simple CNN model we define has 3 convolutional layers: the first with 32 filters of size 3x3, the second with 64 3x3 filters and the third with 64 3x3 filters, all activated via ReLU. After each of the first two convolutional layers there is a 2x2 MaxPooling subsampling layer, which reduces the image dimensions while retaining the useful information. At the end there are 2 fully connected layers: the first has 64 neurons and the second, the output layer, has as many neurons as the number of classes in use. The penultimate layer uses a ReLU activation, while the output layer uses a softmax activation function to normalize the values into the range [0,1] (probabilities).

In [ ]:
# a simple CNN https://www.tensorflow.org/tutorials/images/cnn

def init_simple_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  model = models.Sequential()
  model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32,32,3)))
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(64, (3, 3), activation='relu'))
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(64, (3, 3), activation='relu'))
  model.add(layers.Flatten())
  model.add(layers.Dense(64, activation='relu'))
  model.add(layers.Dense(CLASSES_NUM, activation='softmax'))
  model.compile(optimizer=optimizer(learning_rate=lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
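As a cross-check of the summary that model.summary() prints, the parameter counts can be reproduced by hand: a Conv2D layer has (k·k·in_channels + 1)·filters parameters (the +1 is the bias) and a Dense layer (inputs + 1)·units. A small sketch of that arithmetic, assuming the 20-class case:

```python
# parameter-count check for the Simple CNN above (20 classes assumed)
def conv_params(k, in_ch, out_ch):
  # k x k kernel over in_ch channels, plus one bias, per output filter
  return (k * k * in_ch + 1) * out_ch

def dense_params(n_in, n_out):
  # one weight per input plus one bias, per output neuron
  return (n_in + 1) * n_out

counts = [
  conv_params(3, 3, 32),          # conv2d:   896
  conv_params(3, 32, 64),         # conv2d_1: 18496
  conv_params(3, 64, 64),         # conv2d_2: 36928
  dense_params(4 * 4 * 64, 64),   # dense:    65600 (flattened 4x4x64 = 1024 inputs)
  dense_params(64, 20),           # dense_1:  1300
]
print(counts, sum(counts))        # total: 123220, matching the summary
```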
In [ ]:
SIMPLE_MODEL = init_simple_model(summary = True)
tf.keras.utils.plot_model(SIMPLE_MODEL, to_file='model.png', show_shapes=True, show_layer_names=False,rankdir='LR', expand_nested=False, dpi=80)
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 30, 30, 32)        896       
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
flatten (Flatten)            (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 64)                65600     
_________________________________________________________________
dense_1 (Dense)              (None, 20)                1300      
=================================================================
Total params: 123,220
Trainable params: 123,220
Non-trainable params: 0
_________________________________________________________________
In [ ]:
SIMPLE_MODEL_history = train_model(SIMPLE_MODEL)
Epoch 1/100
266/266 [==============================] - 3s 5ms/step - loss: 2.9499 - accuracy: 0.0951 - val_loss: 2.7037 - val_accuracy: 0.1862
Epoch 2/100
266/266 [==============================] - 1s 4ms/step - loss: 2.6786 - accuracy: 0.1851 - val_loss: 2.5047 - val_accuracy: 0.2334
Epoch 3/100
266/266 [==============================] - 1s 4ms/step - loss: 2.4985 - accuracy: 0.2490 - val_loss: 2.3568 - val_accuracy: 0.2992
Epoch 4/100
266/266 [==============================] - 1s 4ms/step - loss: 2.3540 - accuracy: 0.2950 - val_loss: 2.2667 - val_accuracy: 0.3211
Epoch 5/100
266/266 [==============================] - 1s 4ms/step - loss: 2.2671 - accuracy: 0.3207 - val_loss: 2.1998 - val_accuracy: 0.3211
Epoch 6/100
266/266 [==============================] - 1s 4ms/step - loss: 2.1668 - accuracy: 0.3464 - val_loss: 2.1467 - val_accuracy: 0.3457
Epoch 7/100
266/266 [==============================] - 1s 4ms/step - loss: 2.1014 - accuracy: 0.3701 - val_loss: 2.1096 - val_accuracy: 0.3717
Epoch 8/100
266/266 [==============================] - 1s 4ms/step - loss: 2.0577 - accuracy: 0.3832 - val_loss: 2.0424 - val_accuracy: 0.3890
Epoch 9/100
266/266 [==============================] - 1s 4ms/step - loss: 1.9845 - accuracy: 0.4038 - val_loss: 1.9824 - val_accuracy: 0.4009
Epoch 10/100
266/266 [==============================] - 1s 4ms/step - loss: 1.9531 - accuracy: 0.4128 - val_loss: 1.9572 - val_accuracy: 0.3969
Epoch 11/100
266/266 [==============================] - 1s 4ms/step - loss: 1.8804 - accuracy: 0.4377 - val_loss: 1.9311 - val_accuracy: 0.4269
Epoch 12/100
266/266 [==============================] - 1s 4ms/step - loss: 1.8592 - accuracy: 0.4499 - val_loss: 1.9008 - val_accuracy: 0.4368
Epoch 13/100
266/266 [==============================] - 1s 4ms/step - loss: 1.8521 - accuracy: 0.4480 - val_loss: 1.8802 - val_accuracy: 0.4388
Epoch 14/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7956 - accuracy: 0.4664 - val_loss: 1.8809 - val_accuracy: 0.4375
Epoch 15/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7700 - accuracy: 0.4650 - val_loss: 1.8452 - val_accuracy: 0.4541
Epoch 16/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7332 - accuracy: 0.4810 - val_loss: 1.8467 - val_accuracy: 0.4475
Epoch 17/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7208 - accuracy: 0.4861 - val_loss: 1.8654 - val_accuracy: 0.4488
Epoch 18/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7333 - accuracy: 0.4858 - val_loss: 1.8127 - val_accuracy: 0.4674
Epoch 19/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6691 - accuracy: 0.4956 - val_loss: 1.8542 - val_accuracy: 0.4574
Epoch 20/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6529 - accuracy: 0.5098 - val_loss: 1.8659 - val_accuracy: 0.4501
Epoch 21/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6565 - accuracy: 0.5020 - val_loss: 1.7846 - val_accuracy: 0.4661
Epoch 22/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6408 - accuracy: 0.5191 - val_loss: 1.7571 - val_accuracy: 0.4847
Epoch 23/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6069 - accuracy: 0.5251 - val_loss: 1.7444 - val_accuracy: 0.4887
Epoch 24/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5480 - accuracy: 0.5342 - val_loss: 1.7374 - val_accuracy: 0.4980
Epoch 25/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5341 - accuracy: 0.5393 - val_loss: 1.7451 - val_accuracy: 0.4947
Epoch 26/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5390 - accuracy: 0.5412 - val_loss: 1.7225 - val_accuracy: 0.5000
Epoch 27/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5321 - accuracy: 0.5432 - val_loss: 1.7262 - val_accuracy: 0.5007
Epoch 28/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4913 - accuracy: 0.5482 - val_loss: 1.7176 - val_accuracy: 0.4927
Epoch 29/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4777 - accuracy: 0.5704 - val_loss: 1.6901 - val_accuracy: 0.5193
Epoch 30/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4504 - accuracy: 0.5675 - val_loss: 1.7086 - val_accuracy: 0.4927
Epoch 31/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4446 - accuracy: 0.5745 - val_loss: 1.7037 - val_accuracy: 0.5047
Epoch 32/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4317 - accuracy: 0.5789 - val_loss: 1.7080 - val_accuracy: 0.4967
Epoch 33/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4131 - accuracy: 0.5771 - val_loss: 1.6880 - val_accuracy: 0.5160
Epoch 34/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3928 - accuracy: 0.5803 - val_loss: 1.6672 - val_accuracy: 0.5173
Epoch 35/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3681 - accuracy: 0.5990 - val_loss: 1.6760 - val_accuracy: 0.5126
Epoch 36/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3595 - accuracy: 0.5897 - val_loss: 1.6727 - val_accuracy: 0.5239
Epoch 37/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3489 - accuracy: 0.5974 - val_loss: 1.6657 - val_accuracy: 0.5146
Epoch 38/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3606 - accuracy: 0.5920 - val_loss: 1.6334 - val_accuracy: 0.5332
Epoch 39/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3286 - accuracy: 0.6045 - val_loss: 1.6601 - val_accuracy: 0.5266
Epoch 40/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2957 - accuracy: 0.6130 - val_loss: 1.6426 - val_accuracy: 0.5399
Epoch 41/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2974 - accuracy: 0.6132 - val_loss: 1.6404 - val_accuracy: 0.5299
Epoch 42/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2805 - accuracy: 0.6280 - val_loss: 1.6284 - val_accuracy: 0.5359
Epoch 43/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2683 - accuracy: 0.6232 - val_loss: 1.6382 - val_accuracy: 0.5293
Epoch 44/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2496 - accuracy: 0.6337 - val_loss: 1.6377 - val_accuracy: 0.5312
Epoch 45/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2461 - accuracy: 0.6285 - val_loss: 1.6225 - val_accuracy: 0.5479
Epoch 46/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2256 - accuracy: 0.6321 - val_loss: 1.6366 - val_accuracy: 0.5312
Epoch 47/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2327 - accuracy: 0.6289 - val_loss: 1.6177 - val_accuracy: 0.5339
Epoch 48/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2213 - accuracy: 0.6337 - val_loss: 1.6624 - val_accuracy: 0.5213
Epoch 49/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1852 - accuracy: 0.6496 - val_loss: 1.6013 - val_accuracy: 0.5372
Epoch 50/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2010 - accuracy: 0.6492 - val_loss: 1.6241 - val_accuracy: 0.5399
Epoch 51/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1457 - accuracy: 0.6494 - val_loss: 1.6432 - val_accuracy: 0.5273
Epoch 52/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1284 - accuracy: 0.6619 - val_loss: 1.6318 - val_accuracy: 0.5426
Epoch 53/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1492 - accuracy: 0.6621 - val_loss: 1.6104 - val_accuracy: 0.5479
Epoch 54/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1094 - accuracy: 0.6788 - val_loss: 1.6214 - val_accuracy: 0.5465
Epoch 55/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0992 - accuracy: 0.6699 - val_loss: 1.6105 - val_accuracy: 0.5465
Epoch 56/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1013 - accuracy: 0.6729 - val_loss: 1.6531 - val_accuracy: 0.5233
Epoch 57/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0893 - accuracy: 0.6788 - val_loss: 1.6535 - val_accuracy: 0.5259
Epoch 58/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0591 - accuracy: 0.6829 - val_loss: 1.6017 - val_accuracy: 0.5512
Epoch 59/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0763 - accuracy: 0.6777 - val_loss: 1.6328 - val_accuracy: 0.5306
Epoch 60/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0339 - accuracy: 0.6912 - val_loss: 1.6063 - val_accuracy: 0.5406
Epoch 61/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0382 - accuracy: 0.6923 - val_loss: 1.6231 - val_accuracy: 0.5412
Epoch 62/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0139 - accuracy: 0.7001 - val_loss: 1.5968 - val_accuracy: 0.5426
Epoch 63/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0408 - accuracy: 0.6925 - val_loss: 1.6092 - val_accuracy: 0.5419
Epoch 64/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0156 - accuracy: 0.6896 - val_loss: 1.6047 - val_accuracy: 0.5492
Epoch 65/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9947 - accuracy: 0.7138 - val_loss: 1.6296 - val_accuracy: 0.5279
Epoch 66/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9712 - accuracy: 0.7086 - val_loss: 1.6921 - val_accuracy: 0.5253
Epoch 67/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9711 - accuracy: 0.7066 - val_loss: 1.6221 - val_accuracy: 0.5439
Epoch 68/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9864 - accuracy: 0.7091 - val_loss: 1.6152 - val_accuracy: 0.5406
Epoch 69/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9218 - accuracy: 0.7297 - val_loss: 1.6371 - val_accuracy: 0.5412
Epoch 70/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9316 - accuracy: 0.7228 - val_loss: 1.6174 - val_accuracy: 0.5419
Epoch 71/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9318 - accuracy: 0.7280 - val_loss: 1.6262 - val_accuracy: 0.5372
Epoch 72/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9141 - accuracy: 0.7309 - val_loss: 1.6471 - val_accuracy: 0.5465
Epoch 73/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9113 - accuracy: 0.7257 - val_loss: 1.6851 - val_accuracy: 0.5392
Epoch 74/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8914 - accuracy: 0.7265 - val_loss: 1.6574 - val_accuracy: 0.5432
Epoch 75/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8917 - accuracy: 0.7330 - val_loss: 1.6380 - val_accuracy: 0.5366
Epoch 76/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8568 - accuracy: 0.7443 - val_loss: 1.6431 - val_accuracy: 0.5485
Epoch 77/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8419 - accuracy: 0.7382 - val_loss: 1.6318 - val_accuracy: 0.5532
Epoch 78/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8408 - accuracy: 0.7416 - val_loss: 1.6656 - val_accuracy: 0.5339
Epoch 79/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8432 - accuracy: 0.7499 - val_loss: 1.6608 - val_accuracy: 0.5392
Epoch 80/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8243 - accuracy: 0.7501 - val_loss: 1.6676 - val_accuracy: 0.5439
Epoch 81/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7845 - accuracy: 0.7622 - val_loss: 1.6741 - val_accuracy: 0.5299
Epoch 82/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8045 - accuracy: 0.7591 - val_loss: 1.6524 - val_accuracy: 0.5492
Epoch 83/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8026 - accuracy: 0.7560 - val_loss: 1.6860 - val_accuracy: 0.5445
Epoch 84/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7834 - accuracy: 0.7664 - val_loss: 1.7052 - val_accuracy: 0.5465
Epoch 85/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7654 - accuracy: 0.7739 - val_loss: 1.6855 - val_accuracy: 0.5439
Epoch 86/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7537 - accuracy: 0.7711 - val_loss: 1.6779 - val_accuracy: 0.5432
Epoch 87/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7480 - accuracy: 0.7722 - val_loss: 1.7598 - val_accuracy: 0.5206
Epoch 88/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7416 - accuracy: 0.7824 - val_loss: 1.7255 - val_accuracy: 0.5412
Epoch 89/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7333 - accuracy: 0.7762 - val_loss: 1.7227 - val_accuracy: 0.5372
Epoch 90/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7411 - accuracy: 0.7712 - val_loss: 1.6867 - val_accuracy: 0.5452
Epoch 91/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7312 - accuracy: 0.7824 - val_loss: 1.7366 - val_accuracy: 0.5239
Epoch 92/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6898 - accuracy: 0.7956 - val_loss: 1.7214 - val_accuracy: 0.5346
Epoch 93/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6858 - accuracy: 0.7902 - val_loss: 1.7701 - val_accuracy: 0.5465
Epoch 94/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6723 - accuracy: 0.8073 - val_loss: 1.7522 - val_accuracy: 0.5326
Epoch 95/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6678 - accuracy: 0.8003 - val_loss: 1.7254 - val_accuracy: 0.5426
Epoch 96/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6499 - accuracy: 0.8070 - val_loss: 1.7517 - val_accuracy: 0.5419
Epoch 97/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6430 - accuracy: 0.8096 - val_loss: 1.7442 - val_accuracy: 0.5419
Epoch 98/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6505 - accuracy: 0.8129 - val_loss: 1.7743 - val_accuracy: 0.5392
Epoch 99/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6198 - accuracy: 0.8196 - val_loss: 1.7805 - val_accuracy: 0.5312
Epoch 100/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6318 - accuracy: 0.8083 - val_loss: 1.7885 - val_accuracy: 0.5359
In [ ]:
loss, accuracy = model_report(SIMPLE_MODEL, SIMPLE_MODEL_history)
losses["SIMPLE_MODEL"] = loss
accuracies["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.867
Accuracy: 53.770%

CNN1

The CNN1 model we define has 3 convolutional layers: the first with 32 filters of size 3x3, the second with 64 3x3 filters and the third with 128 3x3 filters, all activated via ReLU. After each of the first two convolutional layers there is a 2x2 MaxPooling subsampling layer, while a 2x2 AveragePooling layer follows the third convolutional layer. At the end there are 2 fully connected layers: the first has 1024 neurons and the second, the output layer, has as many neurons as the number of classes in use. The penultimate layer uses a ReLU activation, while the output layer uses a softmax activation function to normalize the values into the range [0,1] (probabilities).

In [ ]:
def init_cnn1_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  model = models.Sequential()
  model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3))) 
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(64, (3, 3), activation='relu'))
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(128, (3, 3), activation='relu'))
  model.add(layers.AveragePooling2D((2, 2)))
  model.add(layers.Flatten())
  model.add(layers.Dense(1024,activation='relu'))
  model.add(layers.Dense(CLASSES_NUM,activation='softmax'))

  model.compile(optimizer=optimizer(learning_rate=lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
CNN1_MODEL = init_cnn1_model(summary = True)
tf.keras.utils.plot_model(CNN1_MODEL, to_file='model.png', show_shapes=True, show_layer_names=False,rankdir='LR', expand_nested=False, dpi=80)
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dense_3 (Dense)              (None, 20)                20500     
=================================================================
Total params: 639,060
Trainable params: 639,060
Non-trainable params: 0
_________________________________________________________________
Out[ ]:
In [ ]:
CNN1_MODEL_history = train_model(CNN1_MODEL)
Epoch 1/100
266/266 [==============================] - 2s 4ms/step - loss: 2.9156 - accuracy: 0.1059 - val_loss: 2.6142 - val_accuracy: 0.2041
Epoch 2/100
266/266 [==============================] - 1s 4ms/step - loss: 2.5928 - accuracy: 0.2177 - val_loss: 2.4248 - val_accuracy: 0.2746
Epoch 3/100
266/266 [==============================] - 1s 4ms/step - loss: 2.4168 - accuracy: 0.2641 - val_loss: 2.2641 - val_accuracy: 0.3039
Epoch 4/100
266/266 [==============================] - 1s 4ms/step - loss: 2.2462 - accuracy: 0.3239 - val_loss: 2.1470 - val_accuracy: 0.3557
Epoch 5/100
266/266 [==============================] - 1s 4ms/step - loss: 2.1190 - accuracy: 0.3605 - val_loss: 2.0958 - val_accuracy: 0.3670
Epoch 6/100
266/266 [==============================] - 1s 4ms/step - loss: 2.0549 - accuracy: 0.3820 - val_loss: 2.0114 - val_accuracy: 0.3949
Epoch 7/100
266/266 [==============================] - 1s 4ms/step - loss: 1.9312 - accuracy: 0.4149 - val_loss: 1.9381 - val_accuracy: 0.4162
Epoch 8/100
266/266 [==============================] - 1s 4ms/step - loss: 1.8958 - accuracy: 0.4334 - val_loss: 1.9343 - val_accuracy: 0.4275
Epoch 9/100
266/266 [==============================] - 1s 4ms/step - loss: 1.8355 - accuracy: 0.4500 - val_loss: 1.8410 - val_accuracy: 0.4488
Epoch 10/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7686 - accuracy: 0.4639 - val_loss: 1.8164 - val_accuracy: 0.4468
Epoch 11/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7017 - accuracy: 0.4872 - val_loss: 1.8288 - val_accuracy: 0.4515
Epoch 12/100
266/266 [==============================] - 1s 4ms/step - loss: 1.7253 - accuracy: 0.4775 - val_loss: 1.7525 - val_accuracy: 0.4840
Epoch 13/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6699 - accuracy: 0.4980 - val_loss: 1.7304 - val_accuracy: 0.4907
Epoch 14/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6312 - accuracy: 0.5038 - val_loss: 1.8003 - val_accuracy: 0.4608
Epoch 15/100
266/266 [==============================] - 1s 4ms/step - loss: 1.6205 - accuracy: 0.5033 - val_loss: 1.7566 - val_accuracy: 0.4668
Epoch 16/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5894 - accuracy: 0.5175 - val_loss: 1.6746 - val_accuracy: 0.5033
Epoch 17/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5435 - accuracy: 0.5258 - val_loss: 1.7125 - val_accuracy: 0.4907
Epoch 18/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5215 - accuracy: 0.5326 - val_loss: 1.6967 - val_accuracy: 0.4880
Epoch 19/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5124 - accuracy: 0.5401 - val_loss: 1.6501 - val_accuracy: 0.5140
Epoch 20/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4728 - accuracy: 0.5456 - val_loss: 1.6344 - val_accuracy: 0.5160
Epoch 21/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4418 - accuracy: 0.5649 - val_loss: 1.6341 - val_accuracy: 0.5093
Epoch 22/100
266/266 [==============================] - 1s 4ms/step - loss: 1.4291 - accuracy: 0.5607 - val_loss: 1.6273 - val_accuracy: 0.5180
Epoch 23/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3717 - accuracy: 0.5814 - val_loss: 1.6313 - val_accuracy: 0.5259
Epoch 24/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3928 - accuracy: 0.5605 - val_loss: 1.6237 - val_accuracy: 0.5193
Epoch 25/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3510 - accuracy: 0.5883 - val_loss: 1.5935 - val_accuracy: 0.5299
Epoch 26/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3234 - accuracy: 0.6021 - val_loss: 1.5862 - val_accuracy: 0.5286
Epoch 27/100
266/266 [==============================] - 1s 4ms/step - loss: 1.3092 - accuracy: 0.5909 - val_loss: 1.5488 - val_accuracy: 0.5406
Epoch 28/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2961 - accuracy: 0.6051 - val_loss: 1.5427 - val_accuracy: 0.5452
Epoch 29/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2730 - accuracy: 0.6072 - val_loss: 1.5731 - val_accuracy: 0.5472
Epoch 30/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2410 - accuracy: 0.6261 - val_loss: 1.5619 - val_accuracy: 0.5445
Epoch 31/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2471 - accuracy: 0.6209 - val_loss: 1.5345 - val_accuracy: 0.5512
Epoch 32/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2115 - accuracy: 0.6386 - val_loss: 1.5442 - val_accuracy: 0.5485
Epoch 33/100
266/266 [==============================] - 1s 4ms/step - loss: 1.2136 - accuracy: 0.6268 - val_loss: 1.5372 - val_accuracy: 0.5519
Epoch 34/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1649 - accuracy: 0.6449 - val_loss: 1.5222 - val_accuracy: 0.5559
Epoch 35/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1669 - accuracy: 0.6422 - val_loss: 1.5201 - val_accuracy: 0.5552
Epoch 36/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1402 - accuracy: 0.6515 - val_loss: 1.5218 - val_accuracy: 0.5552
Epoch 37/100
266/266 [==============================] - 1s 4ms/step - loss: 1.1484 - accuracy: 0.6503 - val_loss: 1.5044 - val_accuracy: 0.5585
Epoch 38/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0995 - accuracy: 0.6587 - val_loss: 1.5404 - val_accuracy: 0.5499
Epoch 39/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0856 - accuracy: 0.6628 - val_loss: 1.5038 - val_accuracy: 0.5585
Epoch 40/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0718 - accuracy: 0.6695 - val_loss: 1.5151 - val_accuracy: 0.5691
Epoch 41/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0456 - accuracy: 0.6833 - val_loss: 1.4955 - val_accuracy: 0.5638
Epoch 42/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0376 - accuracy: 0.6860 - val_loss: 1.5203 - val_accuracy: 0.5565
Epoch 43/100
266/266 [==============================] - 1s 4ms/step - loss: 1.0002 - accuracy: 0.6951 - val_loss: 1.4800 - val_accuracy: 0.5738
Epoch 44/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9830 - accuracy: 0.7058 - val_loss: 1.5291 - val_accuracy: 0.5572
Epoch 45/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9528 - accuracy: 0.7097 - val_loss: 1.5197 - val_accuracy: 0.5532
Epoch 46/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9525 - accuracy: 0.7117 - val_loss: 1.4806 - val_accuracy: 0.5672
Epoch 47/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9256 - accuracy: 0.7175 - val_loss: 1.4823 - val_accuracy: 0.5725
Epoch 48/100
266/266 [==============================] - 1s 4ms/step - loss: 0.9227 - accuracy: 0.7248 - val_loss: 1.4968 - val_accuracy: 0.5645
Epoch 49/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8774 - accuracy: 0.7262 - val_loss: 1.5139 - val_accuracy: 0.5718
Epoch 50/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8678 - accuracy: 0.7335 - val_loss: 1.4618 - val_accuracy: 0.5751
Epoch 51/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8627 - accuracy: 0.7314 - val_loss: 1.5031 - val_accuracy: 0.5691
Epoch 52/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8362 - accuracy: 0.7507 - val_loss: 1.5011 - val_accuracy: 0.5638
Epoch 53/100
266/266 [==============================] - 1s 4ms/step - loss: 0.8433 - accuracy: 0.7376 - val_loss: 1.4961 - val_accuracy: 0.5652
Epoch 54/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7939 - accuracy: 0.7597 - val_loss: 1.4832 - val_accuracy: 0.5731
Epoch 55/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7631 - accuracy: 0.7672 - val_loss: 1.5172 - val_accuracy: 0.5711
Epoch 56/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7775 - accuracy: 0.7675 - val_loss: 1.5675 - val_accuracy: 0.5691
Epoch 57/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7383 - accuracy: 0.7751 - val_loss: 1.5062 - val_accuracy: 0.5652
Epoch 58/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7224 - accuracy: 0.7862 - val_loss: 1.5455 - val_accuracy: 0.5678
Epoch 59/100
266/266 [==============================] - 1s 4ms/step - loss: 0.7129 - accuracy: 0.7811 - val_loss: 1.5034 - val_accuracy: 0.5831
Epoch 60/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6894 - accuracy: 0.7909 - val_loss: 1.5232 - val_accuracy: 0.5738
Epoch 61/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6696 - accuracy: 0.7953 - val_loss: 1.5792 - val_accuracy: 0.5672
Epoch 62/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6417 - accuracy: 0.8033 - val_loss: 1.5307 - val_accuracy: 0.5665
Epoch 63/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6371 - accuracy: 0.8061 - val_loss: 1.5694 - val_accuracy: 0.5685
Epoch 64/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6224 - accuracy: 0.8163 - val_loss: 1.5358 - val_accuracy: 0.5831
Epoch 65/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6258 - accuracy: 0.8121 - val_loss: 1.5631 - val_accuracy: 0.5731
Epoch 66/100
266/266 [==============================] - 1s 4ms/step - loss: 0.6043 - accuracy: 0.8183 - val_loss: 1.6205 - val_accuracy: 0.5638
Epoch 67/100
266/266 [==============================] - 1s 4ms/step - loss: 0.5768 - accuracy: 0.8188 - val_loss: 1.5590 - val_accuracy: 0.5818
Epoch 68/100
266/266 [==============================] - 1s 4ms/step - loss: 0.5582 - accuracy: 0.8374 - val_loss: 1.6094 - val_accuracy: 0.5838
Epoch 69/100
266/266 [==============================] - 1s 4ms/step - loss: 0.5564 - accuracy: 0.8319 - val_loss: 1.5808 - val_accuracy: 0.5731
Epoch 70/100
266/266 [==============================] - 1s 4ms/step - loss: 0.5223 - accuracy: 0.8452 - val_loss: 1.6362 - val_accuracy: 0.5785
Epoch 71/100
266/266 [==============================] - 1s 4ms/step - loss: 0.5192 - accuracy: 0.8495 - val_loss: 1.6117 - val_accuracy: 0.5851
Epoch 72/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4852 - accuracy: 0.8580 - val_loss: 1.6341 - val_accuracy: 0.5758
Epoch 73/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4930 - accuracy: 0.8636 - val_loss: 1.6408 - val_accuracy: 0.5818
Epoch 74/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4469 - accuracy: 0.8711 - val_loss: 1.6643 - val_accuracy: 0.5738
Epoch 75/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4547 - accuracy: 0.8701 - val_loss: 1.6553 - val_accuracy: 0.5798
Epoch 76/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4245 - accuracy: 0.8726 - val_loss: 1.6656 - val_accuracy: 0.5745
Epoch 77/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4092 - accuracy: 0.8858 - val_loss: 1.6958 - val_accuracy: 0.5824
Epoch 78/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3772 - accuracy: 0.8948 - val_loss: 1.7184 - val_accuracy: 0.5831
Epoch 79/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3855 - accuracy: 0.8945 - val_loss: 1.7388 - val_accuracy: 0.5811
Epoch 80/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3890 - accuracy: 0.8905 - val_loss: 1.7464 - val_accuracy: 0.5751
Epoch 81/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3443 - accuracy: 0.9067 - val_loss: 1.8405 - val_accuracy: 0.5711
Epoch 82/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3466 - accuracy: 0.9052 - val_loss: 1.7743 - val_accuracy: 0.5731
Epoch 83/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3370 - accuracy: 0.9124 - val_loss: 1.8097 - val_accuracy: 0.5705
Epoch 84/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3105 - accuracy: 0.9192 - val_loss: 1.8261 - val_accuracy: 0.5844
Epoch 85/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3093 - accuracy: 0.9191 - val_loss: 1.8270 - val_accuracy: 0.5851
Epoch 86/100
266/266 [==============================] - 1s 4ms/step - loss: 0.3007 - accuracy: 0.9227 - val_loss: 1.8206 - val_accuracy: 0.5824
Epoch 87/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2704 - accuracy: 0.9337 - val_loss: 1.8538 - val_accuracy: 0.5718
Epoch 88/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2686 - accuracy: 0.9265 - val_loss: 1.8640 - val_accuracy: 0.5751
Epoch 89/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2436 - accuracy: 0.9439 - val_loss: 1.8759 - val_accuracy: 0.5824
Epoch 90/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2373 - accuracy: 0.9460 - val_loss: 1.9197 - val_accuracy: 0.5718
Epoch 91/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2336 - accuracy: 0.9428 - val_loss: 1.9576 - val_accuracy: 0.5818
Epoch 92/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2285 - accuracy: 0.9448 - val_loss: 1.9442 - val_accuracy: 0.5864
Epoch 93/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2219 - accuracy: 0.9440 - val_loss: 2.0511 - val_accuracy: 0.5718
Epoch 94/100
266/266 [==============================] - 1s 4ms/step - loss: 0.2007 - accuracy: 0.9565 - val_loss: 2.0346 - val_accuracy: 0.5672
Epoch 95/100
266/266 [==============================] - 1s 4ms/step - loss: 0.1931 - accuracy: 0.9552 - val_loss: 2.0045 - val_accuracy: 0.5805
Epoch 96/100
266/266 [==============================] - 1s 4ms/step - loss: 0.1740 - accuracy: 0.9631 - val_loss: 2.1143 - val_accuracy: 0.5638
Epoch 97/100
266/266 [==============================] - 1s 4ms/step - loss: 0.1985 - accuracy: 0.9516 - val_loss: 2.0502 - val_accuracy: 0.5758
Epoch 98/100
266/266 [==============================] - 1s 4ms/step - loss: 0.1617 - accuracy: 0.9655 - val_loss: 2.0321 - val_accuracy: 0.5858
Epoch 99/100
266/266 [==============================] - 1s 4ms/step - loss: 0.1605 - accuracy: 0.9689 - val_loss: 2.1202 - val_accuracy: 0.5725
Epoch 100/100
266/266 [==============================] - 1s 4ms/step - loss: 0.1485 - accuracy: 0.9684 - val_loss: 2.1383 - val_accuracy: 0.5738
In [ ]:
loss, accuracy = model_report(CNN1_MODEL, CNN1_MODEL_history)
losses["CNN1"] = loss
accuracies["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     2.213
Accuracy: 56.548%

CNN2

The CNN2 model we define consists of four convolutional layers in total: the first with 32 filters of size 3x3, the second with 64 3x3 filters, the third with 128 3x3 filters, and the fourth with 256 3x3 filters. All four use ReLU activations, and "same" padding is applied so that the feature maps keep their spatial dimensions as they pass through each convolutional layer. Each of the first three convolutional layers is followed by a 2x2 MaxPooling subsampling layer, which reduces dimensionality while preserving the useful information. The network ends with three fully connected layers: the first with 512 neurons, the second with 128, and the third (the output layer) with as many neurons as the number of classes in the current problem. The penultimate layer uses a ReLU activation, while the output layer uses a softmax activation function to normalize its outputs to the range [0, 1] (probabilities).

In [ ]:
def init_cnn2_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  model = models.Sequential()
  model.add(layers.Conv2D(32, (3, 3), activation='relu',padding="same", input_shape=(32, 32, 3))) 
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(64, (3, 3), activation='relu',padding="same"))
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(128, (3, 3), activation='relu',padding="same"))
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Conv2D(256, (3, 3), activation='relu',padding="same"))
  model.add(layers.Flatten())
  model.add(layers.Dense(512,activation='relu'))
  model.add(layers.Dense(128,activation='relu'))
  model.add(layers.Dense(CLASSES_NUM,activation='softmax'))

  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
CNN2_MODEL = init_cnn2_model(summary = True)
tf.keras.utils.plot_model(CNN2_MODEL, to_file='model.png', show_shapes=True, show_layer_names=False,rankdir='LR', expand_nested=False, dpi=80)
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 8, 128)         73856     
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 4, 4, 256)         295168    
_________________________________________________________________
flatten_2 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dense_5 (Dense)              (None, 128)               65664     
_________________________________________________________________
dense_6 (Dense)              (None, 20)                2580      
=================================================================
Total params: 2,554,324
Trainable params: 2,554,324
Non-trainable params: 0
_________________________________________________________________
Out[ ]:
In [ ]:
CNN2_MODEL_history = train_model(CNN2_MODEL)
Epoch 1/100
266/266 [==============================] - 2s 5ms/step - loss: 2.8961 - accuracy: 0.1100 - val_loss: 2.4397 - val_accuracy: 0.2480
Epoch 2/100
266/266 [==============================] - 1s 4ms/step - loss: 2.3937 - accuracy: 0.2793 - val_loss: 2.2038 - val_accuracy: 0.3298
Epoch 3/100
266/266 [==============================] - 1s 5ms/step - loss: 2.1638 - accuracy: 0.3319 - val_loss: 2.0351 - val_accuracy: 0.3677
Epoch 4/100
266/266 [==============================] - 1s 5ms/step - loss: 1.9969 - accuracy: 0.3915 - val_loss: 1.9372 - val_accuracy: 0.4009
Epoch 5/100
266/266 [==============================] - 1s 5ms/step - loss: 1.8741 - accuracy: 0.4367 - val_loss: 1.8745 - val_accuracy: 0.4315
Epoch 6/100
266/266 [==============================] - 1s 5ms/step - loss: 1.7463 - accuracy: 0.4692 - val_loss: 1.7926 - val_accuracy: 0.4588
Epoch 7/100
266/266 [==============================] - 1s 5ms/step - loss: 1.6397 - accuracy: 0.5079 - val_loss: 1.7361 - val_accuracy: 0.4707
Epoch 8/100
266/266 [==============================] - 1s 4ms/step - loss: 1.5890 - accuracy: 0.5153 - val_loss: 1.6744 - val_accuracy: 0.5093
Epoch 9/100
266/266 [==============================] - 1s 5ms/step - loss: 1.5074 - accuracy: 0.5409 - val_loss: 1.6416 - val_accuracy: 0.5027
Epoch 10/100
266/266 [==============================] - 1s 5ms/step - loss: 1.4366 - accuracy: 0.5549 - val_loss: 1.6068 - val_accuracy: 0.5266
Epoch 11/100
266/266 [==============================] - 1s 5ms/step - loss: 1.3995 - accuracy: 0.5825 - val_loss: 1.5912 - val_accuracy: 0.5366
Epoch 12/100
266/266 [==============================] - 1s 5ms/step - loss: 1.3085 - accuracy: 0.6005 - val_loss: 1.5779 - val_accuracy: 0.5293
Epoch 13/100
266/266 [==============================] - 1s 5ms/step - loss: 1.2297 - accuracy: 0.6203 - val_loss: 1.6037 - val_accuracy: 0.5160
Epoch 14/100
266/266 [==============================] - 1s 5ms/step - loss: 1.1885 - accuracy: 0.6478 - val_loss: 1.5163 - val_accuracy: 0.5525
Epoch 15/100
266/266 [==============================] - 1s 5ms/step - loss: 1.1017 - accuracy: 0.6646 - val_loss: 1.4998 - val_accuracy: 0.5618
Epoch 16/100
266/266 [==============================] - 1s 5ms/step - loss: 1.0984 - accuracy: 0.6548 - val_loss: 1.5031 - val_accuracy: 0.5658
Epoch 17/100
266/266 [==============================] - 1s 5ms/step - loss: 1.0187 - accuracy: 0.6886 - val_loss: 1.4859 - val_accuracy: 0.5665
Epoch 18/100
266/266 [==============================] - 1s 5ms/step - loss: 0.9652 - accuracy: 0.7022 - val_loss: 1.4663 - val_accuracy: 0.5718
Epoch 19/100
266/266 [==============================] - 1s 5ms/step - loss: 0.9328 - accuracy: 0.7120 - val_loss: 1.5516 - val_accuracy: 0.5698
Epoch 20/100
266/266 [==============================] - 1s 5ms/step - loss: 0.8793 - accuracy: 0.7275 - val_loss: 1.4611 - val_accuracy: 0.5858
Epoch 21/100
266/266 [==============================] - 1s 5ms/step - loss: 0.8175 - accuracy: 0.7510 - val_loss: 1.4979 - val_accuracy: 0.5924
Epoch 22/100
266/266 [==============================] - 1s 5ms/step - loss: 0.7652 - accuracy: 0.7622 - val_loss: 1.4760 - val_accuracy: 0.5824
Epoch 23/100
266/266 [==============================] - 1s 5ms/step - loss: 0.7007 - accuracy: 0.7803 - val_loss: 1.5479 - val_accuracy: 0.5785
Epoch 24/100
266/266 [==============================] - 1s 5ms/step - loss: 0.6699 - accuracy: 0.7946 - val_loss: 1.5284 - val_accuracy: 0.5884
Epoch 25/100
266/266 [==============================] - 1s 5ms/step - loss: 0.5954 - accuracy: 0.8210 - val_loss: 1.6279 - val_accuracy: 0.5738
Epoch 26/100
266/266 [==============================] - 1s 5ms/step - loss: 0.5596 - accuracy: 0.8336 - val_loss: 1.6090 - val_accuracy: 0.5864
Epoch 27/100
266/266 [==============================] - 1s 5ms/step - loss: 0.4966 - accuracy: 0.8541 - val_loss: 1.5934 - val_accuracy: 0.5957
Epoch 28/100
266/266 [==============================] - 1s 5ms/step - loss: 0.4408 - accuracy: 0.8776 - val_loss: 1.6420 - val_accuracy: 0.5918
Epoch 29/100
266/266 [==============================] - 1s 4ms/step - loss: 0.4303 - accuracy: 0.8711 - val_loss: 1.6588 - val_accuracy: 0.5844
Epoch 30/100
266/266 [==============================] - 1s 5ms/step - loss: 0.3678 - accuracy: 0.8982 - val_loss: 1.7123 - val_accuracy: 0.5824
Epoch 31/100
266/266 [==============================] - 1s 5ms/step - loss: 0.3140 - accuracy: 0.9126 - val_loss: 1.7954 - val_accuracy: 0.5891
Epoch 32/100
266/266 [==============================] - 1s 5ms/step - loss: 0.2704 - accuracy: 0.9255 - val_loss: 1.7847 - val_accuracy: 0.5957
Epoch 33/100
266/266 [==============================] - 1s 5ms/step - loss: 0.2487 - accuracy: 0.9345 - val_loss: 1.9020 - val_accuracy: 0.5785
Epoch 34/100
266/266 [==============================] - 1s 5ms/step - loss: 0.2258 - accuracy: 0.9422 - val_loss: 1.9086 - val_accuracy: 0.5765
Epoch 35/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1855 - accuracy: 0.9549 - val_loss: 1.9985 - val_accuracy: 0.5791
Epoch 36/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1735 - accuracy: 0.9583 - val_loss: 1.9792 - val_accuracy: 0.5957
Epoch 37/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1623 - accuracy: 0.9569 - val_loss: 2.1055 - val_accuracy: 0.5858
Epoch 38/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1289 - accuracy: 0.9669 - val_loss: 2.1685 - val_accuracy: 0.5831
Epoch 39/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1156 - accuracy: 0.9719 - val_loss: 2.1616 - val_accuracy: 0.6017
Epoch 40/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0843 - accuracy: 0.9850 - val_loss: 2.2179 - val_accuracy: 0.5944
Epoch 41/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0862 - accuracy: 0.9811 - val_loss: 2.2631 - val_accuracy: 0.6077
Epoch 42/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0564 - accuracy: 0.9916 - val_loss: 2.3799 - val_accuracy: 0.5911
Epoch 43/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0816 - accuracy: 0.9826 - val_loss: 2.4353 - val_accuracy: 0.5838
Epoch 44/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0663 - accuracy: 0.9865 - val_loss: 2.5037 - val_accuracy: 0.5864
Epoch 45/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0664 - accuracy: 0.9852 - val_loss: 2.5099 - val_accuracy: 0.5944
Epoch 46/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0580 - accuracy: 0.9900 - val_loss: 2.5291 - val_accuracy: 0.5878
Epoch 47/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0248 - accuracy: 0.9989 - val_loss: 2.6419 - val_accuracy: 0.5964
Epoch 48/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0232 - accuracy: 0.9978 - val_loss: 2.7901 - val_accuracy: 0.5805
Epoch 49/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0359 - accuracy: 0.9934 - val_loss: 2.5725 - val_accuracy: 0.5718
Epoch 50/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1326 - accuracy: 0.9577 - val_loss: 2.5094 - val_accuracy: 0.5984
Epoch 51/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0276 - accuracy: 0.9967 - val_loss: 2.6931 - val_accuracy: 0.5977
Epoch 52/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0093 - accuracy: 0.9998 - val_loss: 2.7547 - val_accuracy: 0.6031
Epoch 53/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0049 - accuracy: 1.0000 - val_loss: 2.8168 - val_accuracy: 0.6051
Epoch 54/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0034 - accuracy: 1.0000 - val_loss: 2.8986 - val_accuracy: 0.6011
Epoch 55/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0029 - accuracy: 1.0000 - val_loss: 2.9798 - val_accuracy: 0.6070
Epoch 56/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0025 - accuracy: 1.0000 - val_loss: 3.0465 - val_accuracy: 0.6024
Epoch 57/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0022 - accuracy: 1.0000 - val_loss: 3.1003 - val_accuracy: 0.6037
Epoch 58/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0091 - accuracy: 0.9980 - val_loss: 2.4885 - val_accuracy: 0.5180
Epoch 59/100
266/266 [==============================] - 1s 5ms/step - loss: 0.3282 - accuracy: 0.8930 - val_loss: 2.6810 - val_accuracy: 0.5632
Epoch 60/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0468 - accuracy: 0.9883 - val_loss: 2.6733 - val_accuracy: 0.5997
Epoch 61/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0117 - accuracy: 0.9988 - val_loss: 2.8143 - val_accuracy: 0.6084
Epoch 62/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0187 - accuracy: 0.9959 - val_loss: 2.8250 - val_accuracy: 0.5805
Epoch 63/100
266/266 [==============================] - 1s 5ms/step - loss: 0.1126 - accuracy: 0.9641 - val_loss: 2.6831 - val_accuracy: 0.5944
Epoch 64/100
266/266 [==============================] - 1s 4ms/step - loss: 0.0130 - accuracy: 0.9991 - val_loss: 2.8234 - val_accuracy: 0.6031
Epoch 65/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0033 - accuracy: 1.0000 - val_loss: 2.8964 - val_accuracy: 0.6097
Epoch 66/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0021 - accuracy: 1.0000 - val_loss: 2.9693 - val_accuracy: 0.6070
Epoch 67/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0017 - accuracy: 1.0000 - val_loss: 3.0683 - val_accuracy: 0.6070
Epoch 68/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0013 - accuracy: 1.0000 - val_loss: 3.1020 - val_accuracy: 0.6057
Epoch 69/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0011 - accuracy: 1.0000 - val_loss: 3.1656 - val_accuracy: 0.6064
Epoch 70/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0010 - accuracy: 1.0000 - val_loss: 3.1994 - val_accuracy: 0.6051
Epoch 71/100
266/266 [==============================] - 1s 5ms/step - loss: 8.9055e-04 - accuracy: 1.0000 - val_loss: 3.2553 - val_accuracy: 0.6051
Epoch 72/100
266/266 [==============================] - 1s 5ms/step - loss: 7.4988e-04 - accuracy: 1.0000 - val_loss: 3.3089 - val_accuracy: 0.6064
Epoch 73/100
266/266 [==============================] - 1s 5ms/step - loss: 6.8227e-04 - accuracy: 1.0000 - val_loss: 3.3433 - val_accuracy: 0.6044
Epoch 74/100
266/266 [==============================] - 1s 5ms/step - loss: 5.9609e-04 - accuracy: 1.0000 - val_loss: 3.4078 - val_accuracy: 0.6044
Epoch 75/100
266/266 [==============================] - 1s 5ms/step - loss: 5.2395e-04 - accuracy: 1.0000 - val_loss: 3.4386 - val_accuracy: 0.6064
Epoch 76/100
266/266 [==============================] - 1s 5ms/step - loss: 5.0531e-04 - accuracy: 1.0000 - val_loss: 3.4834 - val_accuracy: 0.6064
Epoch 77/100
266/266 [==============================] - 1s 5ms/step - loss: 4.2616e-04 - accuracy: 1.0000 - val_loss: 3.5234 - val_accuracy: 0.6037
Epoch 78/100
266/266 [==============================] - 1s 5ms/step - loss: 3.6666e-04 - accuracy: 1.0000 - val_loss: 3.6023 - val_accuracy: 0.6024
Epoch 79/100
266/266 [==============================] - 1s 5ms/step - loss: 3.6294e-04 - accuracy: 1.0000 - val_loss: 3.6068 - val_accuracy: 0.6070
Epoch 80/100
266/266 [==============================] - 1s 5ms/step - loss: 2.9417e-04 - accuracy: 1.0000 - val_loss: 3.6651 - val_accuracy: 0.6024
Epoch 81/100
266/266 [==============================] - 1s 5ms/step - loss: 0.2981 - accuracy: 0.9202 - val_loss: 2.5123 - val_accuracy: 0.5765
Epoch 82/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0874 - accuracy: 0.9770 - val_loss: 2.6754 - val_accuracy: 0.5871
Epoch 83/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0159 - accuracy: 0.9975 - val_loss: 2.8912 - val_accuracy: 0.6017
Epoch 84/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0029 - accuracy: 1.0000 - val_loss: 3.0059 - val_accuracy: 0.5951
Epoch 85/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0133 - accuracy: 0.9968 - val_loss: 3.1520 - val_accuracy: 0.5592
Epoch 86/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0910 - accuracy: 0.9718 - val_loss: 2.8750 - val_accuracy: 0.5851
Epoch 87/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0093 - accuracy: 0.9989 - val_loss: 3.0203 - val_accuracy: 0.5997
Epoch 88/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0019 - accuracy: 1.0000 - val_loss: 3.1102 - val_accuracy: 0.6037
Epoch 89/100
266/266 [==============================] - 1s 5ms/step - loss: 0.0012 - accuracy: 1.0000 - val_loss: 3.1907 - val_accuracy: 0.6070
Epoch 90/100
266/266 [==============================] - 1s 5ms/step - loss: 8.5670e-04 - accuracy: 1.0000 - val_loss: 3.2754 - val_accuracy: 0.6097
Epoch 91/100
266/266 [==============================] - 1s 5ms/step - loss: 7.5319e-04 - accuracy: 1.0000 - val_loss: 3.3140 - val_accuracy: 0.6077
Epoch 92/100
266/266 [==============================] - 1s 5ms/step - loss: 6.3639e-04 - accuracy: 1.0000 - val_loss: 3.3763 - val_accuracy: 0.6077
Epoch 93/100
266/266 [==============================] - 1s 5ms/step - loss: 5.1207e-04 - accuracy: 1.0000 - val_loss: 3.4308 - val_accuracy: 0.6064
Epoch 94/100
266/266 [==============================] - 1s 5ms/step - loss: 4.6051e-04 - accuracy: 1.0000 - val_loss: 3.4968 - val_accuracy: 0.6031
Epoch 95/100
266/266 [==============================] - 1s 5ms/step - loss: 3.9492e-04 - accuracy: 1.0000 - val_loss: 3.5254 - val_accuracy: 0.6031
Epoch 96/100
266/266 [==============================] - 1s 5ms/step - loss: 3.3183e-04 - accuracy: 1.0000 - val_loss: 3.5677 - val_accuracy: 0.6037
Epoch 97/100
266/266 [==============================] - 1s 5ms/step - loss: 3.1117e-04 - accuracy: 1.0000 - val_loss: 3.6059 - val_accuracy: 0.6090
Epoch 98/100
266/266 [==============================] - 1s 5ms/step - loss: 2.6555e-04 - accuracy: 1.0000 - val_loss: 3.6377 - val_accuracy: 0.6070
Epoch 99/100
266/266 [==============================] - 1s 5ms/step - loss: 2.3192e-04 - accuracy: 1.0000 - val_loss: 3.6867 - val_accuracy: 0.6104
Epoch 100/100
266/266 [==============================] - 1s 5ms/step - loss: 2.0149e-04 - accuracy: 1.0000 - val_loss: 3.7255 - val_accuracy: 0.6117
In [ ]:
loss, accuracy = model_report(CNN2_MODEL, CNN2_MODEL_history)
losses["CNN2"] = loss
accuracies["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     3.686
Accuracy: 59.276%

Transfer learning

To obtain higher classification accuracy we turn to transfer learning. Specifically, we use the VGG16, MobileNet, and DenseNet networks, all pre-trained on ImageNet. For each of them we first train only the classification head, keeping all convolutional layers frozen. We then train both the classifier and some of the convolutional layers closest to the network's output. Finally, we unfreeze the entire model and train all of its layers.
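The three regimes above all reduce to toggling Keras' `trainable` flag at the right granularity. As a minimal sketch, the backbone below is a tiny hypothetical stand-in (not the real VGG16), used only to show how the flag propagates and how it changes the trainable-parameter count at each stage:

```python
import tensorflow as tf

# Hypothetical miniature backbone standing in for VGG16's convolutional base.
backbone = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu",
                           input_shape=(32, 32, 3)),   # 3*3*3*8 + 8 = 224 params
    tf.keras.layers.Conv2D(8, 3, activation="relu"),   # 3*3*8*8 + 8 = 584 params
    tf.keras.layers.Conv2D(8, 3, activation="relu"),   # 584 params
], name="backbone")

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(20, activation="softmax"),   # 8*20 + 20 = 180 params
])
model.build(input_shape=(None, 32, 32, 3))

def trainable_params(m):
    return sum(int(tf.size(w)) for w in m.trainable_weights)

# Regime 1: freeze the whole backbone, train only the classification head.
backbone.trainable = False
print(trainable_params(model))   # 180 (head only)

# Regime 2: additionally fine-tune the conv layer nearest the output.
backbone.trainable = True
for layer in backbone.layers[:-1]:
    layer.trainable = False
print(trainable_params(model))   # 584 + 180 = 764

# Regime 3: unfreeze everything.
for layer in backbone.layers:
    layer.trainable = True
print(trainable_params(model))   # 224 + 584 + 584 + 180 = 1572
```

Note that after changing `trainable` a model must be re-compiled for the change to take effect during training, which is why each of our experiments rebuilds and recompiles the model from scratch.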

VGG16

The first model we examine is VGG16, a CNN proposed by K. Simonyan and A. Zisserman that achieves 92.7% top-5 accuracy on the ImageNet dataset. The model improves on AlexNet by replacing its large kernels (11×11 and 5×5 in the first and second convolutional layers, respectively) with stacks of consecutive 3×3 filters. Its architecture is shown in the following image:
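The substitution pays off in parameter count: a stack of n 3×3 convolutions has the same receptive field as a single (2n+1)×(2n+1) convolution but needs fewer weights. A quick back-of-the-envelope check (biases ignored, C input and output channels):

```python
# Parameters of one k x k convolution vs. a stack of n 3 x 3 convolutions
# covering the same receptive field, for C input and C output channels.

def receptive_field(n_stacked_3x3):
    # each additional 3x3 conv grows the receptive field by 2
    return 2 * n_stacked_3x3 + 1

def params_single(k, C):
    return k * k * C * C

def params_stacked(n, C):
    return n * 3 * 3 * C * C

C = 256
for n in (2, 3):
    k = receptive_field(n)  # 5 for n=2, 7 for n=3
    print(f"{k}x{k}: {params_single(k, C):,} params "
          f"vs {n} stacked 3x3: {params_stacked(n, C):,}")
# e.g. a 7x7 conv costs 49*C^2 parameters, the equivalent 3-conv stack only 27*C^2
```

The stacking also inserts an extra non-linearity after every 3×3 layer, which VGG's authors argue makes the decision function more discriminative.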

vgg
Training only the classification head
In [ ]:
# transfer learning: VGG16 trained on ImageNet without the top layer

def init_VGG16_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  vgg_model=tf.keras.applications.VGG16(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  # use the pre-trained convolutional base directly
  VGG16_MODEL = vgg_model

  # freeze conv layers
  VGG16_MODEL.trainable=False
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([VGG16_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
VGG16_MODEL = init_VGG16_model(True)
VGG16_MODEL_history = train_model(VGG16_MODEL)
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58892288/58889256 [==============================] - 0s 0us/step
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout (Dropout)            (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 512)               0         
_________________________________________________________________
dense_7 (Dense)              (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 10,260
Non-trainable params: 14,714,688
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 4s 10ms/step - loss: 3.4611 - accuracy: 0.0450 - val_loss: 2.9177 - val_accuracy: 0.1137
Epoch 2/100
266/266 [==============================] - 3s 10ms/step - loss: 3.0544 - accuracy: 0.0874 - val_loss: 2.6964 - val_accuracy: 0.2035
Epoch 3/100
266/266 [==============================] - 3s 10ms/step - loss: 2.8389 - accuracy: 0.1372 - val_loss: 2.5381 - val_accuracy: 0.2779
Epoch 4/100
266/266 [==============================] - 3s 10ms/step - loss: 2.6766 - accuracy: 0.1940 - val_loss: 2.4159 - val_accuracy: 0.3318
Epoch 5/100
266/266 [==============================] - 3s 10ms/step - loss: 2.5381 - accuracy: 0.2287 - val_loss: 2.3154 - val_accuracy: 0.3763
Epoch 6/100
266/266 [==============================] - 3s 10ms/step - loss: 2.4436 - accuracy: 0.2601 - val_loss: 2.2355 - val_accuracy: 0.4003
Epoch 7/100
266/266 [==============================] - 3s 10ms/step - loss: 2.3609 - accuracy: 0.2846 - val_loss: 2.1665 - val_accuracy: 0.4162
Epoch 8/100
266/266 [==============================] - 3s 10ms/step - loss: 2.2817 - accuracy: 0.3118 - val_loss: 2.1096 - val_accuracy: 0.4322
Epoch 9/100
266/266 [==============================] - 3s 10ms/step - loss: 2.2266 - accuracy: 0.3282 - val_loss: 2.0609 - val_accuracy: 0.4435
Epoch 10/100
266/266 [==============================] - 3s 10ms/step - loss: 2.1800 - accuracy: 0.3489 - val_loss: 2.0200 - val_accuracy: 0.4574
Epoch 11/100
266/266 [==============================] - 3s 10ms/step - loss: 2.1306 - accuracy: 0.3696 - val_loss: 1.9847 - val_accuracy: 0.4668
Epoch 12/100
266/266 [==============================] - 3s 10ms/step - loss: 2.0888 - accuracy: 0.3774 - val_loss: 1.9531 - val_accuracy: 0.4674
Epoch 13/100
266/266 [==============================] - 3s 10ms/step - loss: 2.0656 - accuracy: 0.3816 - val_loss: 1.9244 - val_accuracy: 0.4734
Epoch 14/100
266/266 [==============================] - 3s 10ms/step - loss: 2.0287 - accuracy: 0.4038 - val_loss: 1.8991 - val_accuracy: 0.4721
Epoch 15/100
266/266 [==============================] - 3s 10ms/step - loss: 2.0081 - accuracy: 0.4100 - val_loss: 1.8772 - val_accuracy: 0.4781
Epoch 16/100
266/266 [==============================] - 3s 10ms/step - loss: 1.9952 - accuracy: 0.4036 - val_loss: 1.8548 - val_accuracy: 0.4847
Epoch 17/100
266/266 [==============================] - 3s 10ms/step - loss: 1.9772 - accuracy: 0.4066 - val_loss: 1.8390 - val_accuracy: 0.4834
Epoch 18/100
266/266 [==============================] - 3s 10ms/step - loss: 1.9471 - accuracy: 0.4187 - val_loss: 1.8236 - val_accuracy: 0.4940
Epoch 19/100
266/266 [==============================] - 3s 10ms/step - loss: 1.9129 - accuracy: 0.4295 - val_loss: 1.8079 - val_accuracy: 0.4980
Epoch 20/100
266/266 [==============================] - 3s 10ms/step - loss: 1.9167 - accuracy: 0.4318 - val_loss: 1.7947 - val_accuracy: 0.5033
Epoch 21/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8958 - accuracy: 0.4363 - val_loss: 1.7793 - val_accuracy: 0.5013
Epoch 22/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8677 - accuracy: 0.4423 - val_loss: 1.7672 - val_accuracy: 0.5100
Epoch 23/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8724 - accuracy: 0.4460 - val_loss: 1.7578 - val_accuracy: 0.5093
Epoch 24/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8645 - accuracy: 0.4340 - val_loss: 1.7479 - val_accuracy: 0.5066
Epoch 25/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8248 - accuracy: 0.4580 - val_loss: 1.7401 - val_accuracy: 0.5113
Epoch 26/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8332 - accuracy: 0.4507 - val_loss: 1.7269 - val_accuracy: 0.5206
Epoch 27/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8282 - accuracy: 0.4500 - val_loss: 1.7178 - val_accuracy: 0.5186
Epoch 28/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7957 - accuracy: 0.4684 - val_loss: 1.7103 - val_accuracy: 0.5199
Epoch 29/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8103 - accuracy: 0.4532 - val_loss: 1.7067 - val_accuracy: 0.5166
Epoch 30/100
266/266 [==============================] - 3s 10ms/step - loss: 1.8085 - accuracy: 0.4598 - val_loss: 1.6979 - val_accuracy: 0.5233
Epoch 31/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7918 - accuracy: 0.4544 - val_loss: 1.6943 - val_accuracy: 0.5233
Epoch 32/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7518 - accuracy: 0.4716 - val_loss: 1.6850 - val_accuracy: 0.5273
Epoch 33/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7422 - accuracy: 0.4742 - val_loss: 1.6824 - val_accuracy: 0.5306
Epoch 34/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7600 - accuracy: 0.4683 - val_loss: 1.6738 - val_accuracy: 0.5273
Epoch 35/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7606 - accuracy: 0.4611 - val_loss: 1.6695 - val_accuracy: 0.5273
Epoch 36/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7378 - accuracy: 0.4688 - val_loss: 1.6649 - val_accuracy: 0.5266
Epoch 37/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7355 - accuracy: 0.4763 - val_loss: 1.6570 - val_accuracy: 0.5266
Epoch 38/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7404 - accuracy: 0.4740 - val_loss: 1.6557 - val_accuracy: 0.5233
Epoch 39/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7199 - accuracy: 0.4747 - val_loss: 1.6482 - val_accuracy: 0.5306
Epoch 40/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7184 - accuracy: 0.4839 - val_loss: 1.6439 - val_accuracy: 0.5306
Epoch 41/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6984 - accuracy: 0.4856 - val_loss: 1.6425 - val_accuracy: 0.5273
Epoch 42/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7230 - accuracy: 0.4812 - val_loss: 1.6381 - val_accuracy: 0.5346
Epoch 43/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6938 - accuracy: 0.4910 - val_loss: 1.6324 - val_accuracy: 0.5326
Epoch 44/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7037 - accuracy: 0.4808 - val_loss: 1.6276 - val_accuracy: 0.5359
Epoch 45/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6948 - accuracy: 0.4894 - val_loss: 1.6238 - val_accuracy: 0.5332
Epoch 46/100
266/266 [==============================] - 3s 10ms/step - loss: 1.7024 - accuracy: 0.4809 - val_loss: 1.6210 - val_accuracy: 0.5372
Epoch 47/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6949 - accuracy: 0.4906 - val_loss: 1.6213 - val_accuracy: 0.5386
Epoch 48/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6879 - accuracy: 0.4813 - val_loss: 1.6179 - val_accuracy: 0.5372
Epoch 49/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6872 - accuracy: 0.4824 - val_loss: 1.6124 - val_accuracy: 0.5386
Epoch 50/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6559 - accuracy: 0.5040 - val_loss: 1.6087 - val_accuracy: 0.5392
Epoch 51/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6591 - accuracy: 0.4939 - val_loss: 1.6072 - val_accuracy: 0.5366
Epoch 52/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6664 - accuracy: 0.4978 - val_loss: 1.6034 - val_accuracy: 0.5399
Epoch 53/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6646 - accuracy: 0.4889 - val_loss: 1.6030 - val_accuracy: 0.5359
Epoch 54/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6470 - accuracy: 0.5030 - val_loss: 1.6022 - val_accuracy: 0.5366
Epoch 55/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6703 - accuracy: 0.4873 - val_loss: 1.6001 - val_accuracy: 0.5426
Epoch 56/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6727 - accuracy: 0.4908 - val_loss: 1.5938 - val_accuracy: 0.5412
Epoch 57/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6439 - accuracy: 0.4991 - val_loss: 1.5953 - val_accuracy: 0.5406
Epoch 58/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6612 - accuracy: 0.5025 - val_loss: 1.5922 - val_accuracy: 0.5399
Epoch 59/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6494 - accuracy: 0.5011 - val_loss: 1.5885 - val_accuracy: 0.5332
Epoch 60/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6221 - accuracy: 0.5037 - val_loss: 1.5888 - val_accuracy: 0.5386
Epoch 61/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6355 - accuracy: 0.5016 - val_loss: 1.5856 - val_accuracy: 0.5426
Epoch 62/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6318 - accuracy: 0.4975 - val_loss: 1.5841 - val_accuracy: 0.5406
Epoch 63/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6352 - accuracy: 0.5025 - val_loss: 1.5820 - val_accuracy: 0.5452
Epoch 64/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6342 - accuracy: 0.5105 - val_loss: 1.5845 - val_accuracy: 0.5372
Epoch 65/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6377 - accuracy: 0.4968 - val_loss: 1.5799 - val_accuracy: 0.5412
Epoch 66/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6377 - accuracy: 0.5044 - val_loss: 1.5765 - val_accuracy: 0.5426
Epoch 67/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6108 - accuracy: 0.5064 - val_loss: 1.5723 - val_accuracy: 0.5445
Epoch 68/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6299 - accuracy: 0.5116 - val_loss: 1.5741 - val_accuracy: 0.5432
Epoch 69/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6315 - accuracy: 0.5039 - val_loss: 1.5749 - val_accuracy: 0.5439
Epoch 70/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6147 - accuracy: 0.5088 - val_loss: 1.5676 - val_accuracy: 0.5426
Epoch 71/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6223 - accuracy: 0.4981 - val_loss: 1.5648 - val_accuracy: 0.5399
Epoch 72/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6057 - accuracy: 0.5042 - val_loss: 1.5691 - val_accuracy: 0.5412
Epoch 73/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5908 - accuracy: 0.5162 - val_loss: 1.5619 - val_accuracy: 0.5459
Epoch 74/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6245 - accuracy: 0.5110 - val_loss: 1.5649 - val_accuracy: 0.5392
Epoch 75/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6332 - accuracy: 0.4970 - val_loss: 1.5622 - val_accuracy: 0.5412
Epoch 76/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6316 - accuracy: 0.4998 - val_loss: 1.5655 - val_accuracy: 0.5366
Epoch 77/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6264 - accuracy: 0.4955 - val_loss: 1.5559 - val_accuracy: 0.5439
Epoch 78/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5774 - accuracy: 0.5146 - val_loss: 1.5587 - val_accuracy: 0.5392
Epoch 79/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6246 - accuracy: 0.5054 - val_loss: 1.5582 - val_accuracy: 0.5419
Epoch 80/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6125 - accuracy: 0.5104 - val_loss: 1.5597 - val_accuracy: 0.5406
Epoch 81/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6115 - accuracy: 0.5103 - val_loss: 1.5533 - val_accuracy: 0.5439
Epoch 82/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6130 - accuracy: 0.5072 - val_loss: 1.5523 - val_accuracy: 0.5399
Epoch 83/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5779 - accuracy: 0.5114 - val_loss: 1.5519 - val_accuracy: 0.5439
Epoch 84/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5821 - accuracy: 0.5179 - val_loss: 1.5493 - val_accuracy: 0.5432
Epoch 85/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6133 - accuracy: 0.5134 - val_loss: 1.5477 - val_accuracy: 0.5445
Epoch 86/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6039 - accuracy: 0.5077 - val_loss: 1.5474 - val_accuracy: 0.5432
Epoch 87/100
266/266 [==============================] - 3s 10ms/step - loss: 1.6037 - accuracy: 0.5120 - val_loss: 1.5490 - val_accuracy: 0.5426
Epoch 88/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5872 - accuracy: 0.5191 - val_loss: 1.5470 - val_accuracy: 0.5432
Epoch 89/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5872 - accuracy: 0.5122 - val_loss: 1.5444 - val_accuracy: 0.5472
Epoch 90/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5767 - accuracy: 0.5182 - val_loss: 1.5448 - val_accuracy: 0.5465
Epoch 91/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5877 - accuracy: 0.5091 - val_loss: 1.5419 - val_accuracy: 0.5452
Epoch 92/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5859 - accuracy: 0.5176 - val_loss: 1.5414 - val_accuracy: 0.5492
Epoch 93/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5701 - accuracy: 0.5143 - val_loss: 1.5411 - val_accuracy: 0.5472
Epoch 94/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5757 - accuracy: 0.5180 - val_loss: 1.5369 - val_accuracy: 0.5459
Epoch 95/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5991 - accuracy: 0.5097 - val_loss: 1.5410 - val_accuracy: 0.5459
Epoch 96/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5687 - accuracy: 0.5222 - val_loss: 1.5398 - val_accuracy: 0.5445
Epoch 97/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5565 - accuracy: 0.5225 - val_loss: 1.5362 - val_accuracy: 0.5472
Epoch 98/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5812 - accuracy: 0.5130 - val_loss: 1.5332 - val_accuracy: 0.5492
Epoch 99/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5971 - accuracy: 0.5092 - val_loss: 1.5323 - val_accuracy: 0.5512
Epoch 100/100
266/266 [==============================] - 3s 10ms/step - loss: 1.5856 - accuracy: 0.5137 - val_loss: 1.5340 - val_accuracy: 0.5492
In [ ]:
loss, accuracy = model_report(VGG16_MODEL, VGG16_MODEL_history)
accuracies["VGG_NONE"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.565
Accuracy: 53.175%
Training the classification head together with some convolutional layers close to it
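With layers [:15] frozen, the trainable part of the network is VGG16's block5 (three 3×3 convolutions with 512 input and 512 output channels each) plus the 20-class dense head. Assuming the standard VGG16 layer shapes, the "Trainable params" figure reported in the model summary that follows can be reproduced by hand:

```python
# Sanity check of the trainable-parameter count when only block5 and the
# classification head of VGG16 are unfrozen (standard VGG16 shapes assumed).

def conv3x3_params(c_in, c_out):
    return 3 * 3 * c_in * c_out + c_out   # weights + biases

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

block5 = 3 * conv3x3_params(512, 512)     # block5_conv1..3
head = dense_params(512, 20)              # 20-class softmax head

print(block5 + head)  # 7089684, matching "Trainable params" in the summary
```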
In [ ]:
# transfer learning: VGG16 trained on ImageNet without the top layer

def init_VGG16_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  vgg_model=tf.keras.applications.VGG16(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  # use the pre-trained convolutional base directly
  VGG16_MODEL = vgg_model

  # freeze everything up to block4_pool (layers 0-14); fine-tune block5
  for layer in VGG16_MODEL.layers[:15]:
      layer.trainable=False
  for layer in VGG16_MODEL.layers[15:]:
      layer.trainable=True

  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([VGG16_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
VGG16_MODEL = init_VGG16_model(True)
VGG16_MODEL_history = train_model(VGG16_MODEL)
Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_1 (Dropout)          (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 512)               0         
_________________________________________________________________
dense_8 (Dense)              (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 7,089,684
Non-trainable params: 7,635,264
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 5s 17ms/step - loss: 2.4057 - accuracy: 0.2897 - val_loss: 1.3783 - val_accuracy: 0.5944
Epoch 2/100
266/266 [==============================] - 4s 16ms/step - loss: 1.3858 - accuracy: 0.5767 - val_loss: 1.2520 - val_accuracy: 0.6243
Epoch 3/100
266/266 [==============================] - 4s 16ms/step - loss: 1.0527 - accuracy: 0.6762 - val_loss: 1.1564 - val_accuracy: 0.6669
Epoch 4/100
266/266 [==============================] - 4s 16ms/step - loss: 0.8847 - accuracy: 0.7256 - val_loss: 1.1376 - val_accuracy: 0.6596
Epoch 5/100
266/266 [==============================] - 4s 16ms/step - loss: 0.6732 - accuracy: 0.7895 - val_loss: 1.0690 - val_accuracy: 0.6895
Epoch 6/100
266/266 [==============================] - 4s 16ms/step - loss: 0.5099 - accuracy: 0.8372 - val_loss: 1.1363 - val_accuracy: 0.6868
Epoch 7/100
266/266 [==============================] - 4s 16ms/step - loss: 0.3768 - accuracy: 0.8872 - val_loss: 1.1394 - val_accuracy: 0.6888
Epoch 8/100
266/266 [==============================] - 4s 16ms/step - loss: 0.2729 - accuracy: 0.9145 - val_loss: 1.3602 - val_accuracy: 0.6589
Epoch 9/100
266/266 [==============================] - 4s 16ms/step - loss: 0.2001 - accuracy: 0.9384 - val_loss: 1.3026 - val_accuracy: 0.6888
Epoch 10/100
266/266 [==============================] - 4s 16ms/step - loss: 0.1223 - accuracy: 0.9646 - val_loss: 1.4012 - val_accuracy: 0.7035
Epoch 11/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0938 - accuracy: 0.9730 - val_loss: 1.4097 - val_accuracy: 0.6928
Epoch 12/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0737 - accuracy: 0.9776 - val_loss: 1.5989 - val_accuracy: 0.6782
Epoch 13/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0559 - accuracy: 0.9829 - val_loss: 1.5296 - val_accuracy: 0.6895
Epoch 14/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0714 - accuracy: 0.9780 - val_loss: 1.5739 - val_accuracy: 0.6955
Epoch 15/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0422 - accuracy: 0.9897 - val_loss: 1.6807 - val_accuracy: 0.6988
Epoch 16/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0527 - accuracy: 0.9847 - val_loss: 1.8692 - val_accuracy: 0.6815
Epoch 17/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0682 - accuracy: 0.9781 - val_loss: 1.7178 - val_accuracy: 0.6822
Epoch 18/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0368 - accuracy: 0.9889 - val_loss: 1.9548 - val_accuracy: 0.6669
Epoch 19/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0696 - accuracy: 0.9784 - val_loss: 1.5931 - val_accuracy: 0.6882
Epoch 20/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0275 - accuracy: 0.9930 - val_loss: 1.8660 - val_accuracy: 0.6948
Epoch 21/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0157 - accuracy: 0.9950 - val_loss: 1.9296 - val_accuracy: 0.6715
Epoch 22/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0442 - accuracy: 0.9848 - val_loss: 1.9910 - val_accuracy: 0.6975
Epoch 23/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0259 - accuracy: 0.9917 - val_loss: 1.8299 - val_accuracy: 0.6981
Epoch 24/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0351 - accuracy: 0.9895 - val_loss: 2.0795 - val_accuracy: 0.6789
Epoch 25/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0470 - accuracy: 0.9855 - val_loss: 1.9180 - val_accuracy: 0.6888
Epoch 26/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0461 - accuracy: 0.9868 - val_loss: 2.0693 - val_accuracy: 0.6662
Epoch 27/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0364 - accuracy: 0.9904 - val_loss: 2.1169 - val_accuracy: 0.6795
Epoch 28/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0303 - accuracy: 0.9902 - val_loss: 1.9735 - val_accuracy: 0.6975
Epoch 29/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0192 - accuracy: 0.9949 - val_loss: 2.0287 - val_accuracy: 0.6822
Epoch 30/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0367 - accuracy: 0.9886 - val_loss: 2.2569 - val_accuracy: 0.6715
Epoch 31/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0195 - accuracy: 0.9930 - val_loss: 2.2923 - val_accuracy: 0.6689
Epoch 32/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0527 - accuracy: 0.9829 - val_loss: 2.0920 - val_accuracy: 0.6722
Epoch 33/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0284 - accuracy: 0.9907 - val_loss: 2.0540 - val_accuracy: 0.7055
Epoch 34/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0166 - accuracy: 0.9956 - val_loss: 2.1629 - val_accuracy: 0.6941
Epoch 35/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0414 - accuracy: 0.9885 - val_loss: 2.3146 - val_accuracy: 0.6855
Epoch 36/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0235 - accuracy: 0.9926 - val_loss: 2.0301 - val_accuracy: 0.7008
Epoch 37/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0335 - accuracy: 0.9874 - val_loss: 2.0846 - val_accuracy: 0.6895
Epoch 38/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0246 - accuracy: 0.9935 - val_loss: 2.1656 - val_accuracy: 0.6795
Epoch 39/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0452 - accuracy: 0.9859 - val_loss: 1.8677 - val_accuracy: 0.6975
Epoch 40/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0121 - accuracy: 0.9970 - val_loss: 2.1502 - val_accuracy: 0.6908
Epoch 41/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0062 - accuracy: 0.9984 - val_loss: 2.3169 - val_accuracy: 0.6809
Epoch 42/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0169 - accuracy: 0.9961 - val_loss: 2.2405 - val_accuracy: 0.6902
Epoch 43/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0469 - accuracy: 0.9849 - val_loss: 2.2657 - val_accuracy: 0.6815
Epoch 44/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0337 - accuracy: 0.9902 - val_loss: 2.0699 - val_accuracy: 0.7048
Epoch 45/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0166 - accuracy: 0.9956 - val_loss: 2.0735 - val_accuracy: 0.6961
Epoch 46/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0087 - accuracy: 0.9994 - val_loss: 2.1397 - val_accuracy: 0.6981
Epoch 47/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0023 - accuracy: 0.9993 - val_loss: 2.1941 - val_accuracy: 0.6948
Epoch 48/100
266/266 [==============================] - 4s 16ms/step - loss: 2.5107e-04 - accuracy: 1.0000 - val_loss: 2.1926 - val_accuracy: 0.7041
Epoch 49/100
266/266 [==============================] - 4s 16ms/step - loss: 1.4364e-04 - accuracy: 1.0000 - val_loss: 2.2176 - val_accuracy: 0.7094
Epoch 50/100
266/266 [==============================] - 4s 16ms/step - loss: 9.9832e-05 - accuracy: 1.0000 - val_loss: 2.2318 - val_accuracy: 0.7088
Epoch 51/100
266/266 [==============================] - 4s 16ms/step - loss: 7.7396e-05 - accuracy: 1.0000 - val_loss: 2.2444 - val_accuracy: 0.7114
Epoch 52/100
266/266 [==============================] - 4s 16ms/step - loss: 6.8163e-05 - accuracy: 1.0000 - val_loss: 2.2918 - val_accuracy: 0.7074
Epoch 53/100
266/266 [==============================] - 4s 16ms/step - loss: 1.7030e-04 - accuracy: 1.0000 - val_loss: 2.3555 - val_accuracy: 0.7035
Epoch 54/100
266/266 [==============================] - 4s 17ms/step - loss: 6.9933e-05 - accuracy: 1.0000 - val_loss: 2.3631 - val_accuracy: 0.7055
Epoch 55/100
266/266 [==============================] - 4s 16ms/step - loss: 3.8994e-05 - accuracy: 1.0000 - val_loss: 2.3698 - val_accuracy: 0.7088
Epoch 56/100
266/266 [==============================] - 4s 17ms/step - loss: 4.3110e-05 - accuracy: 1.0000 - val_loss: 2.3725 - val_accuracy: 0.7048
Epoch 57/100
266/266 [==============================] - 4s 16ms/step - loss: 2.5091e-05 - accuracy: 1.0000 - val_loss: 2.3910 - val_accuracy: 0.7035
Epoch 58/100
266/266 [==============================] - 4s 16ms/step - loss: 2.4305e-05 - accuracy: 1.0000 - val_loss: 2.4228 - val_accuracy: 0.7061
Epoch 59/100
266/266 [==============================] - 4s 16ms/step - loss: 3.1526e-05 - accuracy: 1.0000 - val_loss: 2.4305 - val_accuracy: 0.7068
Epoch 60/100
266/266 [==============================] - 4s 16ms/step - loss: 1.8390e-05 - accuracy: 1.0000 - val_loss: 2.4603 - val_accuracy: 0.7055
Epoch 61/100
266/266 [==============================] - 4s 16ms/step - loss: 1.8263e-05 - accuracy: 1.0000 - val_loss: 2.4731 - val_accuracy: 0.7074
Epoch 62/100
266/266 [==============================] - 4s 16ms/step - loss: 1.1938e-05 - accuracy: 1.0000 - val_loss: 2.5060 - val_accuracy: 0.7074
Epoch 63/100
266/266 [==============================] - 4s 17ms/step - loss: 1.1993e-05 - accuracy: 1.0000 - val_loss: 2.5446 - val_accuracy: 0.7068
Epoch 64/100
266/266 [==============================] - 4s 16ms/step - loss: 1.4046e-05 - accuracy: 1.0000 - val_loss: 2.5444 - val_accuracy: 0.7048
Epoch 65/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0090 - accuracy: 0.9979 - val_loss: 2.0275 - val_accuracy: 0.6290
Epoch 66/100
266/266 [==============================] - 4s 16ms/step - loss: 0.2789 - accuracy: 0.9226 - val_loss: 2.0570 - val_accuracy: 0.6762
Epoch 67/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0565 - accuracy: 0.9831 - val_loss: 2.0855 - val_accuracy: 0.6882
Epoch 68/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0036 - accuracy: 0.9994 - val_loss: 2.0854 - val_accuracy: 0.6928
Epoch 69/100
266/266 [==============================] - 4s 16ms/step - loss: 5.5019e-04 - accuracy: 1.0000 - val_loss: 2.1577 - val_accuracy: 0.6928
Epoch 70/100
266/266 [==============================] - 4s 16ms/step - loss: 4.5745e-04 - accuracy: 0.9999 - val_loss: 2.2325 - val_accuracy: 0.6908
Epoch 71/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0031 - accuracy: 0.9993 - val_loss: 2.3355 - val_accuracy: 0.6742
Epoch 72/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0674 - accuracy: 0.9824 - val_loss: 2.1641 - val_accuracy: 0.6835
Epoch 73/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0276 - accuracy: 0.9924 - val_loss: 2.3732 - val_accuracy: 0.6955
Epoch 74/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0450 - accuracy: 0.9874 - val_loss: 2.3616 - val_accuracy: 0.6775
Epoch 75/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0313 - accuracy: 0.9940 - val_loss: 2.4391 - val_accuracy: 0.6835
Epoch 76/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0256 - accuracy: 0.9944 - val_loss: 2.4002 - val_accuracy: 0.6941
Epoch 77/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0197 - accuracy: 0.9959 - val_loss: 2.5597 - val_accuracy: 0.6775
Epoch 78/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0183 - accuracy: 0.9942 - val_loss: 2.4406 - val_accuracy: 0.6802
Epoch 79/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0235 - accuracy: 0.9925 - val_loss: 2.4968 - val_accuracy: 0.6888
Epoch 80/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0216 - accuracy: 0.9946 - val_loss: 2.7190 - val_accuracy: 0.6702
Epoch 81/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0401 - accuracy: 0.9886 - val_loss: 2.6675 - val_accuracy: 0.6755
Epoch 82/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0158 - accuracy: 0.9956 - val_loss: 2.7273 - val_accuracy: 0.6749
Epoch 83/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0102 - accuracy: 0.9974 - val_loss: 2.4812 - val_accuracy: 0.6915
Epoch 84/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0155 - accuracy: 0.9966 - val_loss: 2.6530 - val_accuracy: 0.6742
Epoch 85/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0192 - accuracy: 0.9939 - val_loss: 2.5271 - val_accuracy: 0.6888
Epoch 86/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0212 - accuracy: 0.9945 - val_loss: 2.8264 - val_accuracy: 0.6782
Epoch 87/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0244 - accuracy: 0.9922 - val_loss: 2.7522 - val_accuracy: 0.6815
Epoch 88/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0107 - accuracy: 0.9981 - val_loss: 2.8051 - val_accuracy: 0.6769
Epoch 89/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0169 - accuracy: 0.9963 - val_loss: 2.4820 - val_accuracy: 0.6888
Epoch 90/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0127 - accuracy: 0.9958 - val_loss: 2.4209 - val_accuracy: 0.6955
Epoch 91/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0085 - accuracy: 0.9973 - val_loss: 2.7497 - val_accuracy: 0.6882
Epoch 92/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0221 - accuracy: 0.9956 - val_loss: 2.5250 - val_accuracy: 0.6855
Epoch 93/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0211 - accuracy: 0.9937 - val_loss: 2.7970 - val_accuracy: 0.6775
Epoch 94/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0115 - accuracy: 0.9961 - val_loss: 2.6126 - val_accuracy: 0.6895
Epoch 95/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0027 - accuracy: 0.9990 - val_loss: 2.8440 - val_accuracy: 0.6702
Epoch 96/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0347 - accuracy: 0.9913 - val_loss: 2.7721 - val_accuracy: 0.6722
Epoch 97/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0162 - accuracy: 0.9952 - val_loss: 2.5361 - val_accuracy: 0.6895
Epoch 98/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0101 - accuracy: 0.9975 - val_loss: 3.0399 - val_accuracy: 0.6536
Epoch 99/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0191 - accuracy: 0.9941 - val_loss: 2.6167 - val_accuracy: 0.6888
Epoch 100/100
266/266 [==============================] - 4s 16ms/step - loss: 0.0023 - accuracy: 0.9990 - val_loss: 2.6105 - val_accuracy: 0.7015
In [ ]:
loss, accuracy = model_report(VGG16_MODEL, VGG16_MODEL_history)
accuracies["VGG_FEW"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     2.569
Accuracy: 68.056%
Training the entire network
In [ ]:
# transfer learning: VGG16 trained on ImageNet without the top layer

def init_VGG16_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  vgg_model=tf.keras.applications.VGG16(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  VGG16_MODEL = vgg_model

  # unfreeze conv layers
  VGG16_MODEL.trainable=True
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([VGG16_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
VGG16_MODEL = init_VGG16_model(True)
VGG16_MODEL_history = train_model(VGG16_MODEL)
Model: "sequential_11"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_8 (Dropout)          (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_8 ( (None, 512)               0         
_________________________________________________________________
dense_15 (Dense)             (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 9s 30ms/step - loss: 2.8183 - accuracy: 0.1473 - val_loss: 1.7259 - val_accuracy: 0.4867
Epoch 2/100
266/266 [==============================] - 8s 29ms/step - loss: 1.6768 - accuracy: 0.5114 - val_loss: 1.1543 - val_accuracy: 0.6589
Epoch 3/100
266/266 [==============================] - 8s 29ms/step - loss: 1.1183 - accuracy: 0.6842 - val_loss: 1.0612 - val_accuracy: 0.6868
Epoch 4/100
266/266 [==============================] - 8s 29ms/step - loss: 0.8053 - accuracy: 0.7636 - val_loss: 0.9649 - val_accuracy: 0.7261
Epoch 5/100
266/266 [==============================] - 8s 29ms/step - loss: 0.5984 - accuracy: 0.8289 - val_loss: 0.9805 - val_accuracy: 0.7354
Epoch 6/100
266/266 [==============================] - 8s 30ms/step - loss: 0.4386 - accuracy: 0.8759 - val_loss: 0.9714 - val_accuracy: 0.7493
Epoch 7/100
266/266 [==============================] - 8s 30ms/step - loss: 0.2826 - accuracy: 0.9182 - val_loss: 0.9113 - val_accuracy: 0.7686
Epoch 8/100
266/266 [==============================] - 8s 30ms/step - loss: 0.2085 - accuracy: 0.9353 - val_loss: 1.1152 - val_accuracy: 0.7487
Epoch 9/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1502 - accuracy: 0.9563 - val_loss: 1.2002 - val_accuracy: 0.7267
Epoch 10/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1361 - accuracy: 0.9629 - val_loss: 1.2354 - val_accuracy: 0.7434
Epoch 11/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1552 - accuracy: 0.9560 - val_loss: 1.2743 - val_accuracy: 0.7394
Epoch 12/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0981 - accuracy: 0.9721 - val_loss: 1.1506 - val_accuracy: 0.7533
Epoch 13/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1127 - accuracy: 0.9665 - val_loss: 1.1891 - val_accuracy: 0.7507
Epoch 14/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0614 - accuracy: 0.9842 - val_loss: 1.4919 - val_accuracy: 0.7420
Epoch 15/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0897 - accuracy: 0.9718 - val_loss: 1.2866 - val_accuracy: 0.7473
Epoch 16/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0514 - accuracy: 0.9852 - val_loss: 1.2647 - val_accuracy: 0.7473
Epoch 17/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0786 - accuracy: 0.9775 - val_loss: 1.0812 - val_accuracy: 0.7646
Epoch 18/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0664 - accuracy: 0.9817 - val_loss: 1.2612 - val_accuracy: 0.7520
Epoch 19/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0520 - accuracy: 0.9845 - val_loss: 1.0959 - val_accuracy: 0.7719
Epoch 20/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0273 - accuracy: 0.9931 - val_loss: 1.2347 - val_accuracy: 0.7347
Epoch 21/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0513 - accuracy: 0.9866 - val_loss: 1.2949 - val_accuracy: 0.7566
Epoch 22/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0436 - accuracy: 0.9886 - val_loss: 1.3701 - val_accuracy: 0.7547
Epoch 23/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0676 - accuracy: 0.9796 - val_loss: 1.2908 - val_accuracy: 0.7467
Epoch 24/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0622 - accuracy: 0.9839 - val_loss: 1.3822 - val_accuracy: 0.7473
Epoch 25/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0734 - accuracy: 0.9815 - val_loss: 1.1562 - val_accuracy: 0.7739
Epoch 26/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0241 - accuracy: 0.9929 - val_loss: 1.3860 - val_accuracy: 0.7513
Epoch 27/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0481 - accuracy: 0.9887 - val_loss: 1.3983 - val_accuracy: 0.7586
Epoch 28/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0704 - accuracy: 0.9814 - val_loss: 1.4417 - val_accuracy: 0.7480
Epoch 29/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0212 - accuracy: 0.9930 - val_loss: 1.4071 - val_accuracy: 0.7467
Epoch 30/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0728 - accuracy: 0.9819 - val_loss: 1.3101 - val_accuracy: 0.7766
Epoch 31/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0501 - accuracy: 0.9885 - val_loss: 1.4332 - val_accuracy: 0.7540
Epoch 32/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0250 - accuracy: 0.9930 - val_loss: 1.3893 - val_accuracy: 0.7513
Epoch 33/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0606 - accuracy: 0.9838 - val_loss: 1.2292 - val_accuracy: 0.7660
Epoch 34/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0183 - accuracy: 0.9953 - val_loss: 1.3637 - val_accuracy: 0.7826
Epoch 35/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0403 - accuracy: 0.9902 - val_loss: 1.4881 - val_accuracy: 0.7493
Epoch 36/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0513 - accuracy: 0.9870 - val_loss: 1.2998 - val_accuracy: 0.7620
Epoch 37/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0315 - accuracy: 0.9911 - val_loss: 1.2314 - val_accuracy: 0.7766
Epoch 38/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0345 - accuracy: 0.9902 - val_loss: 1.3661 - val_accuracy: 0.7467
Epoch 39/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0365 - accuracy: 0.9902 - val_loss: 1.3209 - val_accuracy: 0.7600
Epoch 40/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0404 - accuracy: 0.9900 - val_loss: 1.3208 - val_accuracy: 0.7626
Epoch 41/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0385 - accuracy: 0.9918 - val_loss: 1.3427 - val_accuracy: 0.7620
Epoch 42/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0203 - accuracy: 0.9961 - val_loss: 1.5011 - val_accuracy: 0.7626
Epoch 43/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0536 - accuracy: 0.9863 - val_loss: 1.3367 - val_accuracy: 0.7773
Epoch 44/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0166 - accuracy: 0.9952 - val_loss: 1.5152 - val_accuracy: 0.7593
Epoch 45/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0359 - accuracy: 0.9905 - val_loss: 1.5962 - val_accuracy: 0.7214
Epoch 46/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0372 - accuracy: 0.9909 - val_loss: 1.4383 - val_accuracy: 0.7633
Epoch 47/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0075 - accuracy: 0.9984 - val_loss: 1.5224 - val_accuracy: 0.7374
Epoch 48/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0460 - accuracy: 0.9882 - val_loss: 1.4940 - val_accuracy: 0.7693
Epoch 49/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0445 - accuracy: 0.9898 - val_loss: 1.3268 - val_accuracy: 0.7753
Epoch 50/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0468 - accuracy: 0.9891 - val_loss: 1.4571 - val_accuracy: 0.7600
Epoch 51/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0374 - accuracy: 0.9919 - val_loss: 1.3406 - val_accuracy: 0.7613
Epoch 52/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0154 - accuracy: 0.9976 - val_loss: 1.4549 - val_accuracy: 0.7686
Epoch 53/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0282 - accuracy: 0.9922 - val_loss: 1.5441 - val_accuracy: 0.7314
Epoch 54/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0195 - accuracy: 0.9937 - val_loss: 1.5542 - val_accuracy: 0.7593
Epoch 55/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0279 - accuracy: 0.9930 - val_loss: 1.5543 - val_accuracy: 0.7301
Epoch 56/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0423 - accuracy: 0.9891 - val_loss: 1.4103 - val_accuracy: 0.7653
Epoch 57/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0145 - accuracy: 0.9958 - val_loss: 1.6069 - val_accuracy: 0.7467
Epoch 58/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0182 - accuracy: 0.9938 - val_loss: 1.4999 - val_accuracy: 0.7560
Epoch 59/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0325 - accuracy: 0.9917 - val_loss: 1.2900 - val_accuracy: 0.7719
Epoch 60/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0081 - accuracy: 0.9982 - val_loss: 1.3133 - val_accuracy: 0.7819
Epoch 61/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0079 - accuracy: 0.9978 - val_loss: 1.4265 - val_accuracy: 0.7699
Epoch 62/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0441 - accuracy: 0.9891 - val_loss: 1.5514 - val_accuracy: 0.7573
Epoch 63/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0462 - accuracy: 0.9882 - val_loss: 1.3832 - val_accuracy: 0.7673
Epoch 64/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0415 - accuracy: 0.9904 - val_loss: 1.3716 - val_accuracy: 0.7739
Epoch 65/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0141 - accuracy: 0.9967 - val_loss: 1.4848 - val_accuracy: 0.7739
Epoch 66/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0093 - accuracy: 0.9983 - val_loss: 1.4327 - val_accuracy: 0.7746
Epoch 67/100
266/266 [==============================] - 8s 30ms/step - loss: 4.3110e-04 - accuracy: 1.0000 - val_loss: 1.4141 - val_accuracy: 0.7866
Epoch 68/100
266/266 [==============================] - 8s 30ms/step - loss: 8.7518e-05 - accuracy: 1.0000 - val_loss: 1.4356 - val_accuracy: 0.7879
Epoch 69/100
266/266 [==============================] - 8s 30ms/step - loss: 5.1002e-05 - accuracy: 1.0000 - val_loss: 1.4586 - val_accuracy: 0.7879
Epoch 70/100
266/266 [==============================] - 8s 30ms/step - loss: 5.5004e-05 - accuracy: 1.0000 - val_loss: 1.4779 - val_accuracy: 0.7879
Epoch 71/100
266/266 [==============================] - 8s 30ms/step - loss: 3.6757e-05 - accuracy: 1.0000 - val_loss: 1.5009 - val_accuracy: 0.7872
Epoch 72/100
266/266 [==============================] - 8s 30ms/step - loss: 2.4479e-05 - accuracy: 1.0000 - val_loss: 1.5282 - val_accuracy: 0.7872
Epoch 73/100
266/266 [==============================] - 8s 30ms/step - loss: 1.8377e-05 - accuracy: 1.0000 - val_loss: 1.5750 - val_accuracy: 0.7886
Epoch 74/100
266/266 [==============================] - 8s 30ms/step - loss: 1.5262e-05 - accuracy: 1.0000 - val_loss: 1.6330 - val_accuracy: 0.7872
Epoch 75/100
266/266 [==============================] - 8s 30ms/step - loss: 1.2071e-05 - accuracy: 1.0000 - val_loss: 1.6836 - val_accuracy: 0.7872
Epoch 76/100
266/266 [==============================] - 8s 30ms/step - loss: 7.4567e-06 - accuracy: 1.0000 - val_loss: 1.7268 - val_accuracy: 0.7879
Epoch 77/100
266/266 [==============================] - 8s 30ms/step - loss: 4.7543e-06 - accuracy: 1.0000 - val_loss: 1.7742 - val_accuracy: 0.7886
Epoch 78/100
266/266 [==============================] - 8s 30ms/step - loss: 3.8517e-06 - accuracy: 1.0000 - val_loss: 1.8180 - val_accuracy: 0.7892
Epoch 79/100
266/266 [==============================] - 8s 30ms/step - loss: 3.9426e-06 - accuracy: 1.0000 - val_loss: 1.8634 - val_accuracy: 0.7892
Epoch 80/100
266/266 [==============================] - 8s 30ms/step - loss: 2.7084e-06 - accuracy: 1.0000 - val_loss: 1.9007 - val_accuracy: 0.7899
Epoch 81/100
266/266 [==============================] - 8s 30ms/step - loss: 2.7919e-06 - accuracy: 1.0000 - val_loss: 1.9434 - val_accuracy: 0.7899
Epoch 82/100
266/266 [==============================] - 8s 30ms/step - loss: 1.3312e-06 - accuracy: 1.0000 - val_loss: 1.9673 - val_accuracy: 0.7906
Epoch 83/100
266/266 [==============================] - 8s 30ms/step - loss: 1.3755e-06 - accuracy: 1.0000 - val_loss: 2.0142 - val_accuracy: 0.7892
Epoch 84/100
266/266 [==============================] - 8s 30ms/step - loss: 1.0281e-06 - accuracy: 1.0000 - val_loss: 2.0368 - val_accuracy: 0.7906
Epoch 85/100
266/266 [==============================] - 8s 30ms/step - loss: 1.6120e-06 - accuracy: 1.0000 - val_loss: 2.0713 - val_accuracy: 0.7919
Epoch 86/100
266/266 [==============================] - 8s 30ms/step - loss: 7.7033e-07 - accuracy: 1.0000 - val_loss: 2.0851 - val_accuracy: 0.7926
Epoch 87/100
266/266 [==============================] - 8s 30ms/step - loss: 5.1291e-07 - accuracy: 1.0000 - val_loss: 2.1125 - val_accuracy: 0.7912
Epoch 88/100
266/266 [==============================] - 8s 30ms/step - loss: 4.7041e-07 - accuracy: 1.0000 - val_loss: 2.1305 - val_accuracy: 0.7919
Epoch 89/100
266/266 [==============================] - 8s 30ms/step - loss: 5.2234e-07 - accuracy: 1.0000 - val_loss: 2.1475 - val_accuracy: 0.7919
Epoch 90/100
266/266 [==============================] - 8s 30ms/step - loss: 8.7090e-07 - accuracy: 1.0000 - val_loss: 2.1719 - val_accuracy: 0.7939
Epoch 91/100
266/266 [==============================] - 8s 30ms/step - loss: 4.7406e-07 - accuracy: 1.0000 - val_loss: 2.2102 - val_accuracy: 0.7932
Epoch 92/100
266/266 [==============================] - 8s 30ms/step - loss: 9.2655e-07 - accuracy: 1.0000 - val_loss: 2.2277 - val_accuracy: 0.7919
Epoch 93/100
266/266 [==============================] - 8s 30ms/step - loss: 2.9525e-07 - accuracy: 1.0000 - val_loss: 2.2627 - val_accuracy: 0.7912
Epoch 94/100
266/266 [==============================] - 8s 30ms/step - loss: 3.8300e-07 - accuracy: 1.0000 - val_loss: 2.2755 - val_accuracy: 0.7919
Epoch 95/100
266/266 [==============================] - 8s 30ms/step - loss: 2.1440e-07 - accuracy: 1.0000 - val_loss: 2.3057 - val_accuracy: 0.7912
Epoch 96/100
266/266 [==============================] - 8s 30ms/step - loss: 1.6759e-07 - accuracy: 1.0000 - val_loss: 2.3113 - val_accuracy: 0.7926
Epoch 97/100
266/266 [==============================] - 8s 30ms/step - loss: 1.2781e-07 - accuracy: 1.0000 - val_loss: 2.3176 - val_accuracy: 0.7926
Epoch 98/100
266/266 [==============================] - 8s 30ms/step - loss: 1.6538e-07 - accuracy: 1.0000 - val_loss: 2.3381 - val_accuracy: 0.7926
Epoch 99/100
266/266 [==============================] - 8s 30ms/step - loss: 2.3697e-07 - accuracy: 1.0000 - val_loss: 2.3554 - val_accuracy: 0.7912
Epoch 100/100
266/266 [==============================] - 8s 30ms/step - loss: 1.3288e-07 - accuracy: 1.0000 - val_loss: 2.3758 - val_accuracy: 0.7932
In [ ]:
loss, accuracy = model_report(VGG16_MODEL, VGG16_MODEL_history)
losses["VGG_ALL"] = loss
accuracies["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     2.502
Accuracy: 78.720%
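The two transfer-learning regimes differ only in which weights receive gradients, and the trainable-parameter counts reported by model.summary() follow directly from the layer shapes. As an illustrative arithmetic check (values taken from the summaries above, not part of the notebook's pipeline):

```python
features = 512                       # VGG16 conv base output channels after pooling
classes = 20                         # CLASSES_NUM for the 20-class experiment
head = features * classes + classes  # Dense layer: weights + biases
backbone = 14_714_688                # VGG16 conv base, as reported by model.summary()

print(head)             # 10260 -> trainable params when only the head is trained
print(backbone + head)  # 14724948 -> trainable params with the whole net unfrozen
```

This is why full fine-tuning reaches a higher accuracy (78.7% vs. 68.1%) at a much higher risk of overfitting: over a thousand times more parameters are being updated.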

MobileNet

Next we examine MobileNet. It is a convolutional network whose efficiency comes from replacing standard convolution blocks with depthwise separable convolution blocks, i.e. a depthwise convolution followed by a pointwise one. In a depthwise convolutional layer there is one filter per input channel, so the number of output channels equals the number of input channels. A pointwise layer, by contrast, is a 1×1 convolution with one filter per output channel, which mixes the channels back together.
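The parameter saving is easy to quantify. As a sketch (the channel sizes below are hypothetical, chosen only for illustration, not taken from MobileNet itself), compare a standard k×k convolution with its depthwise-separable replacement:

```python
def conv_params(k, c_in, c_out):
    # standard convolution: one k x k x c_in filter per output channel, plus biases
    return k * k * c_in * c_out + c_out

def separable_params(k, c_in, c_out):
    depthwise = k * k * c_in + c_in   # one k x k filter per input channel
    pointwise = c_in * c_out + c_out  # 1x1 convolution mixing the channels
    return depthwise + pointwise

print(conv_params(3, 64, 128))       # 73856
print(separable_params(3, 64, 128))  # 8960 -- roughly an 8x reduction
```

The compute cost shrinks by a similar factor, which is what makes the architecture suitable for mobile devices.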

Training only the classification head
In [ ]:
# transfer learning: MobileNet trained on ImageNet without the top layer

def init_MobileNetV2_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  mobilenetV2_model=tf.keras.applications.MobileNetV2(input_shape=(IMG_SIZE,IMG_SIZE,3), include_top=False, weights='imagenet')
  
  MobileNetV2_MODEL = mobilenetV2_model

  # freeze conv layers
  MobileNetV2_MODEL.trainable=False
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([MobileNetV2_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate=lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
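Unlike the VGG16 experiment, the MobileNetV2 weights are loaded for a larger input: the (7, 7, 1280) feature map in the summary below implies 224×224 inputs, since the network's overall stride is 32. The 32×32 CIFAR images are therefore upscaled, which is why the resized datasets (train_ds_res, validation_ds_res) are used here. A minimal sketch of such an upscaling, assuming nearest-neighbour interpolation (the notebook's actual resize helper is defined elsewhere):

```python
import numpy as np

def upscale_nn(img, size=224):
    # nearest-neighbour upscale of a square H x H x C image to size x size
    idx = np.arange(size) * img.shape[0] // size  # source row/col for each target pixel
    return img[idx][:, idx]

cifar_img = np.zeros((32, 32, 3), dtype=np.float32)
print(upscale_nn(cifar_img).shape)  # (224, 224, 3)
```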
In [ ]:
MobileNetV2_MODEL = init_MobileNetV2_model(True)
MobileNetV2_MODEL_history = train_model(MobileNetV2_MODEL, train_ds_res, validation_ds_res)
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9412608/9406464 [==============================] - 0s 0us/step
Model: "sequential_6"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_3 (Dropout)          (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_3 ( (None, 1280)              0         
_________________________________________________________________
dense_10 (Dense)             (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 25,620
Non-trainable params: 2,257,984
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 20s 65ms/step - loss: 2.9378 - accuracy: 0.1384 - val_loss: 1.9172 - val_accuracy: 0.4867
Epoch 2/100
266/266 [==============================] - 17s 62ms/step - loss: 1.7971 - accuracy: 0.5207 - val_loss: 1.3908 - val_accuracy: 0.6150
Epoch 3/100
266/266 [==============================] - 17s 63ms/step - loss: 1.3514 - accuracy: 0.6280 - val_loss: 1.1768 - val_accuracy: 0.6649
Epoch 4/100
266/266 [==============================] - 17s 63ms/step - loss: 1.1237 - accuracy: 0.6916 - val_loss: 1.0415 - val_accuracy: 0.6968
Epoch 5/100
266/266 [==============================] - 17s 62ms/step - loss: 1.0062 - accuracy: 0.7170 - val_loss: 0.9673 - val_accuracy: 0.7174
Epoch 6/100
266/266 [==============================] - 17s 62ms/step - loss: 0.9085 - accuracy: 0.7428 - val_loss: 0.9065 - val_accuracy: 0.7360
Epoch 7/100
266/266 [==============================] - 17s 62ms/step - loss: 0.8813 - accuracy: 0.7390 - val_loss: 0.8740 - val_accuracy: 0.7380
Epoch 8/100
266/266 [==============================] - 17s 63ms/step - loss: 0.8427 - accuracy: 0.7510 - val_loss: 0.8358 - val_accuracy: 0.7540
Epoch 9/100
266/266 [==============================] - 17s 63ms/step - loss: 0.7805 - accuracy: 0.7752 - val_loss: 0.8136 - val_accuracy: 0.7553
Epoch 10/100
266/266 [==============================] - 17s 62ms/step - loss: 0.7407 - accuracy: 0.7805 - val_loss: 0.7898 - val_accuracy: 0.7593
Epoch 11/100
266/266 [==============================] - 17s 63ms/step - loss: 0.7117 - accuracy: 0.7857 - val_loss: 0.7721 - val_accuracy: 0.7726
Epoch 12/100
266/266 [==============================] - 17s 62ms/step - loss: 0.6848 - accuracy: 0.7964 - val_loss: 0.7557 - val_accuracy: 0.7713
Epoch 13/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6945 - accuracy: 0.7943 - val_loss: 0.7392 - val_accuracy: 0.7779
Epoch 14/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6497 - accuracy: 0.8063 - val_loss: 0.7273 - val_accuracy: 0.7753
Epoch 15/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6291 - accuracy: 0.8095 - val_loss: 0.7218 - val_accuracy: 0.7786
Epoch 16/100
266/266 [==============================] - 17s 62ms/step - loss: 0.6267 - accuracy: 0.8099 - val_loss: 0.7100 - val_accuracy: 0.7846
Epoch 17/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6166 - accuracy: 0.8192 - val_loss: 0.7000 - val_accuracy: 0.7846
Epoch 18/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6070 - accuracy: 0.8176 - val_loss: 0.6906 - val_accuracy: 0.7945
Epoch 19/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5799 - accuracy: 0.8278 - val_loss: 0.6862 - val_accuracy: 0.7859
Epoch 20/100
266/266 [==============================] - 17s 62ms/step - loss: 0.5675 - accuracy: 0.8293 - val_loss: 0.6763 - val_accuracy: 0.7926
Epoch 21/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5657 - accuracy: 0.8323 - val_loss: 0.6696 - val_accuracy: 0.7906
Epoch 22/100
266/266 [==============================] - 17s 62ms/step - loss: 0.5628 - accuracy: 0.8287 - val_loss: 0.6698 - val_accuracy: 0.7939
Epoch 23/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5416 - accuracy: 0.8378 - val_loss: 0.6575 - val_accuracy: 0.7972
Epoch 24/100
266/266 [==============================] - 17s 62ms/step - loss: 0.5222 - accuracy: 0.8438 - val_loss: 0.6569 - val_accuracy: 0.7979
Epoch 25/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5179 - accuracy: 0.8421 - val_loss: 0.6488 - val_accuracy: 0.8019
Epoch 26/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5092 - accuracy: 0.8464 - val_loss: 0.6466 - val_accuracy: 0.8025
Epoch 27/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5217 - accuracy: 0.8426 - val_loss: 0.6474 - val_accuracy: 0.7979
Epoch 28/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5044 - accuracy: 0.8490 - val_loss: 0.6386 - val_accuracy: 0.8059
Epoch 29/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4770 - accuracy: 0.8626 - val_loss: 0.6412 - val_accuracy: 0.8025
Epoch 30/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4746 - accuracy: 0.8589 - val_loss: 0.6323 - val_accuracy: 0.8065
Epoch 31/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4783 - accuracy: 0.8553 - val_loss: 0.6363 - val_accuracy: 0.8005
Epoch 32/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4661 - accuracy: 0.8664 - val_loss: 0.6358 - val_accuracy: 0.7952
Epoch 33/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4591 - accuracy: 0.8639 - val_loss: 0.6232 - val_accuracy: 0.8105
Epoch 34/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4491 - accuracy: 0.8697 - val_loss: 0.6229 - val_accuracy: 0.8125
Epoch 35/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4591 - accuracy: 0.8663 - val_loss: 0.6247 - val_accuracy: 0.8059
Epoch 36/100
266/266 [==============================] - 17s 62ms/step - loss: 0.4296 - accuracy: 0.8740 - val_loss: 0.6223 - val_accuracy: 0.8025
Epoch 37/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4360 - accuracy: 0.8708 - val_loss: 0.6223 - val_accuracy: 0.8085
Epoch 38/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4228 - accuracy: 0.8741 - val_loss: 0.6168 - val_accuracy: 0.8059
Epoch 39/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4196 - accuracy: 0.8804 - val_loss: 0.6123 - val_accuracy: 0.8118
Epoch 40/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4236 - accuracy: 0.8778 - val_loss: 0.6132 - val_accuracy: 0.8112
Epoch 41/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3997 - accuracy: 0.8802 - val_loss: 0.6105 - val_accuracy: 0.8105
Epoch 42/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4159 - accuracy: 0.8761 - val_loss: 0.6156 - val_accuracy: 0.8085
Epoch 43/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3997 - accuracy: 0.8839 - val_loss: 0.6134 - val_accuracy: 0.8078
Epoch 44/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3832 - accuracy: 0.8893 - val_loss: 0.6098 - val_accuracy: 0.8098
Epoch 45/100
266/266 [==============================] - 17s 62ms/step - loss: 0.3973 - accuracy: 0.8834 - val_loss: 0.6056 - val_accuracy: 0.8098
Epoch 46/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3876 - accuracy: 0.8890 - val_loss: 0.6041 - val_accuracy: 0.8125
Epoch 47/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3929 - accuracy: 0.8867 - val_loss: 0.6090 - val_accuracy: 0.8045
Epoch 48/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3713 - accuracy: 0.8907 - val_loss: 0.6064 - val_accuracy: 0.8112
Epoch 49/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3712 - accuracy: 0.8962 - val_loss: 0.6023 - val_accuracy: 0.8098
Epoch 50/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3693 - accuracy: 0.8945 - val_loss: 0.6069 - val_accuracy: 0.8032
Epoch 51/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3702 - accuracy: 0.8995 - val_loss: 0.6019 - val_accuracy: 0.8105
Epoch 52/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3696 - accuracy: 0.8919 - val_loss: 0.6043 - val_accuracy: 0.8065
Epoch 53/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3706 - accuracy: 0.8928 - val_loss: 0.6036 - val_accuracy: 0.8065
Epoch 54/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3533 - accuracy: 0.9018 - val_loss: 0.6084 - val_accuracy: 0.8059
Epoch 55/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3567 - accuracy: 0.9012 - val_loss: 0.6011 - val_accuracy: 0.8085
Epoch 56/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3426 - accuracy: 0.8986 - val_loss: 0.6045 - val_accuracy: 0.8025
Epoch 57/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3439 - accuracy: 0.9069 - val_loss: 0.5991 - val_accuracy: 0.8112
Epoch 58/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3369 - accuracy: 0.9071 - val_loss: 0.6005 - val_accuracy: 0.8098
Epoch 59/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3369 - accuracy: 0.9088 - val_loss: 0.6033 - val_accuracy: 0.8032
Epoch 60/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3472 - accuracy: 0.8996 - val_loss: 0.6023 - val_accuracy: 0.8085
Epoch 61/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3314 - accuracy: 0.9104 - val_loss: 0.6040 - val_accuracy: 0.8098
Epoch 62/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3278 - accuracy: 0.9128 - val_loss: 0.5981 - val_accuracy: 0.8125
Epoch 63/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3151 - accuracy: 0.9149 - val_loss: 0.6057 - val_accuracy: 0.8059
Epoch 64/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3097 - accuracy: 0.9205 - val_loss: 0.6018 - val_accuracy: 0.8092
Epoch 65/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3196 - accuracy: 0.9142 - val_loss: 0.6051 - val_accuracy: 0.8065
Epoch 66/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3088 - accuracy: 0.9142 - val_loss: 0.5959 - val_accuracy: 0.8078
Epoch 67/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3095 - accuracy: 0.9146 - val_loss: 0.6005 - val_accuracy: 0.8085
Epoch 68/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3080 - accuracy: 0.9180 - val_loss: 0.5954 - val_accuracy: 0.8105
Epoch 69/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3163 - accuracy: 0.9116 - val_loss: 0.6002 - val_accuracy: 0.8072
Epoch 70/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3089 - accuracy: 0.9116 - val_loss: 0.5986 - val_accuracy: 0.8085
Epoch 71/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3002 - accuracy: 0.9240 - val_loss: 0.6025 - val_accuracy: 0.8065
Epoch 72/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2930 - accuracy: 0.9214 - val_loss: 0.5992 - val_accuracy: 0.8112
Epoch 73/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2977 - accuracy: 0.9236 - val_loss: 0.5990 - val_accuracy: 0.8118
Epoch 74/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2964 - accuracy: 0.9212 - val_loss: 0.5984 - val_accuracy: 0.8112
Epoch 75/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2956 - accuracy: 0.9226 - val_loss: 0.6014 - val_accuracy: 0.8112
Epoch 76/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2862 - accuracy: 0.9224 - val_loss: 0.5979 - val_accuracy: 0.8132
Epoch 77/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2828 - accuracy: 0.9263 - val_loss: 0.5955 - val_accuracy: 0.8098
Epoch 78/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2916 - accuracy: 0.9245 - val_loss: 0.5970 - val_accuracy: 0.8092
Epoch 79/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2785 - accuracy: 0.9269 - val_loss: 0.5959 - val_accuracy: 0.8118
Epoch 80/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2739 - accuracy: 0.9298 - val_loss: 0.6008 - val_accuracy: 0.8098
Epoch 81/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2779 - accuracy: 0.9288 - val_loss: 0.5957 - val_accuracy: 0.8092
Epoch 82/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2856 - accuracy: 0.9210 - val_loss: 0.5982 - val_accuracy: 0.8098
Epoch 83/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2707 - accuracy: 0.9295 - val_loss: 0.6017 - val_accuracy: 0.8118
Epoch 84/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2678 - accuracy: 0.9307 - val_loss: 0.5988 - val_accuracy: 0.8092
Epoch 85/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2653 - accuracy: 0.9292 - val_loss: 0.5992 - val_accuracy: 0.8085
Epoch 86/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2713 - accuracy: 0.9234 - val_loss: 0.5985 - val_accuracy: 0.8085
Epoch 87/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2657 - accuracy: 0.9285 - val_loss: 0.6011 - val_accuracy: 0.8092
Epoch 88/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2629 - accuracy: 0.9326 - val_loss: 0.6024 - val_accuracy: 0.8105
Epoch 89/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2666 - accuracy: 0.9288 - val_loss: 0.6054 - val_accuracy: 0.8085
Epoch 90/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2626 - accuracy: 0.9303 - val_loss: 0.6009 - val_accuracy: 0.8105
Epoch 91/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2551 - accuracy: 0.9355 - val_loss: 0.6013 - val_accuracy: 0.8118
Epoch 92/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2483 - accuracy: 0.9347 - val_loss: 0.6023 - val_accuracy: 0.8125
Epoch 93/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2552 - accuracy: 0.9336 - val_loss: 0.5996 - val_accuracy: 0.8138
Epoch 94/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2500 - accuracy: 0.9369 - val_loss: 0.6046 - val_accuracy: 0.8092
Epoch 95/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2488 - accuracy: 0.9387 - val_loss: 0.6074 - val_accuracy: 0.8105
Epoch 96/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2530 - accuracy: 0.9308 - val_loss: 0.6086 - val_accuracy: 0.8085
Epoch 97/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2373 - accuracy: 0.9428 - val_loss: 0.5996 - val_accuracy: 0.8092
Epoch 98/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2511 - accuracy: 0.9363 - val_loss: 0.6006 - val_accuracy: 0.8092
Epoch 99/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2536 - accuracy: 0.9323 - val_loss: 0.6050 - val_accuracy: 0.8172
Epoch 100/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2353 - accuracy: 0.9427 - val_loss: 0.5975 - val_accuracy: 0.8105
In [ ]:
loss, accuracy = model_report(MobileNetV2_MODEL, MobileNetV2_MODEL_history, test_ds_res)
accuracies["MOBILENET_NONE"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.605
Accuracy: 81.944%
Training the classification head and a few convolutional layers close to it
In [ ]:
# transfer learning: MobileNet trained on ImageNet without the top layer

def init_MobileNetV2_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  mobilenetV2_model=tf.keras.applications.MobileNetV2(input_shape=(IMG_SIZE,IMG_SIZE,3), include_top=False, weights='imagenet')
  
  MobileNetV2_MODEL = mobilenetV2_model

  # freeze the first 152 layers; fine-tune only the top convolutional layers
  for layer in MobileNetV2_MODEL.layers[:152]:
    layer.trainable = False
  for layer in MobileNetV2_MODEL.layers[152:]:
    layer.trainable = True
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([MobileNetV2_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
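The classification head added above is a single Dense layer on the 1280-dimensional MobileNetV2 feature vector, so its parameter count in the model summary (25,620 for the 20-class setting) can be double-checked by hand. A minimal sketch of that arithmetic (the `dense_params` helper is ours, introduced only for illustration):

```python
# Parameter count of a Dense classification head: one weight per
# (input feature, class) pair, plus one bias per class.
def dense_params(in_features: int, classes: int) -> int:
    return in_features * classes + classes

head = dense_params(1280, 20)   # MobileNetV2 feature size x 20 classes
print(head)                     # 25620, matching the summary
print(2257984 + head)           # 2283604 total params of the full model
```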
In [ ]:
MobileNetV2_MODEL = init_MobileNetV2_model(True)
MobileNetV2_MODEL_history = train_model(MobileNetV2_MODEL, train_ds_res, validation_ds_res)
Model: "sequential_7"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_4 (Dropout)          (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_4 ( (None, 1280)              0         
_________________________________________________________________
dense_11 (Dense)             (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 28,180
Non-trainable params: 2,255,424
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 20s 66ms/step - loss: 2.7484 - accuracy: 0.1753 - val_loss: 1.8299 - val_accuracy: 0.4874
Epoch 2/100
266/266 [==============================] - 17s 63ms/step - loss: 1.7015 - accuracy: 0.5541 - val_loss: 1.3509 - val_accuracy: 0.6277
Epoch 3/100
266/266 [==============================] - 17s 63ms/step - loss: 1.2929 - accuracy: 0.6644 - val_loss: 1.1246 - val_accuracy: 0.6888
Epoch 4/100
266/266 [==============================] - 17s 63ms/step - loss: 1.1149 - accuracy: 0.6983 - val_loss: 1.0117 - val_accuracy: 0.7061
Epoch 5/100
266/266 [==============================] - 17s 63ms/step - loss: 0.9749 - accuracy: 0.7283 - val_loss: 0.9258 - val_accuracy: 0.7267
Epoch 6/100
266/266 [==============================] - 17s 63ms/step - loss: 0.9132 - accuracy: 0.7444 - val_loss: 0.8744 - val_accuracy: 0.7420
Epoch 7/100
266/266 [==============================] - 17s 63ms/step - loss: 0.8571 - accuracy: 0.7517 - val_loss: 0.8313 - val_accuracy: 0.7493
Epoch 8/100
266/266 [==============================] - 17s 63ms/step - loss: 0.8094 - accuracy: 0.7634 - val_loss: 0.8038 - val_accuracy: 0.7586
Epoch 9/100
266/266 [==============================] - 17s 63ms/step - loss: 0.7949 - accuracy: 0.7703 - val_loss: 0.7760 - val_accuracy: 0.7633
Epoch 10/100
266/266 [==============================] - 17s 63ms/step - loss: 0.7385 - accuracy: 0.7859 - val_loss: 0.7554 - val_accuracy: 0.7680
Epoch 11/100
266/266 [==============================] - 17s 63ms/step - loss: 0.7251 - accuracy: 0.7878 - val_loss: 0.7413 - val_accuracy: 0.7706
Epoch 12/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6958 - accuracy: 0.7986 - val_loss: 0.7223 - val_accuracy: 0.7753
Epoch 13/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6795 - accuracy: 0.7970 - val_loss: 0.7102 - val_accuracy: 0.7812
Epoch 14/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6650 - accuracy: 0.8061 - val_loss: 0.7002 - val_accuracy: 0.7806
Epoch 15/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6491 - accuracy: 0.8048 - val_loss: 0.6885 - val_accuracy: 0.7779
Epoch 16/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6361 - accuracy: 0.8113 - val_loss: 0.6808 - val_accuracy: 0.7879
Epoch 17/100
266/266 [==============================] - 17s 63ms/step - loss: 0.6024 - accuracy: 0.8214 - val_loss: 0.6757 - val_accuracy: 0.7872
Epoch 18/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5955 - accuracy: 0.8228 - val_loss: 0.6624 - val_accuracy: 0.7912
Epoch 19/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5881 - accuracy: 0.8221 - val_loss: 0.6567 - val_accuracy: 0.7899
Epoch 20/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5784 - accuracy: 0.8247 - val_loss: 0.6534 - val_accuracy: 0.7952
Epoch 21/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5594 - accuracy: 0.8324 - val_loss: 0.6494 - val_accuracy: 0.7979
Epoch 22/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5436 - accuracy: 0.8397 - val_loss: 0.6464 - val_accuracy: 0.7939
Epoch 23/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5478 - accuracy: 0.8369 - val_loss: 0.6425 - val_accuracy: 0.7999
Epoch 24/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5424 - accuracy: 0.8385 - val_loss: 0.6417 - val_accuracy: 0.7972
Epoch 25/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5241 - accuracy: 0.8472 - val_loss: 0.6331 - val_accuracy: 0.8012
Epoch 26/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5129 - accuracy: 0.8452 - val_loss: 0.6309 - val_accuracy: 0.8092
Epoch 27/100
266/266 [==============================] - 17s 63ms/step - loss: 0.5181 - accuracy: 0.8531 - val_loss: 0.6287 - val_accuracy: 0.8045
Epoch 28/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4961 - accuracy: 0.8533 - val_loss: 0.6290 - val_accuracy: 0.8039
Epoch 29/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4823 - accuracy: 0.8601 - val_loss: 0.6206 - val_accuracy: 0.8059
Epoch 30/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4663 - accuracy: 0.8660 - val_loss: 0.6133 - val_accuracy: 0.8085
Epoch 31/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4585 - accuracy: 0.8653 - val_loss: 0.6163 - val_accuracy: 0.8092
Epoch 32/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4753 - accuracy: 0.8611 - val_loss: 0.6142 - val_accuracy: 0.8092
Epoch 33/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4445 - accuracy: 0.8661 - val_loss: 0.6132 - val_accuracy: 0.8125
Epoch 34/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4331 - accuracy: 0.8717 - val_loss: 0.6111 - val_accuracy: 0.8158
Epoch 35/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4275 - accuracy: 0.8754 - val_loss: 0.6099 - val_accuracy: 0.8105
Epoch 36/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4359 - accuracy: 0.8782 - val_loss: 0.6055 - val_accuracy: 0.8112
Epoch 37/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4330 - accuracy: 0.8706 - val_loss: 0.6091 - val_accuracy: 0.8092
Epoch 38/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4315 - accuracy: 0.8782 - val_loss: 0.6077 - val_accuracy: 0.8098
Epoch 39/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4201 - accuracy: 0.8772 - val_loss: 0.6042 - val_accuracy: 0.8165
Epoch 40/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4237 - accuracy: 0.8755 - val_loss: 0.6052 - val_accuracy: 0.8132
Epoch 41/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4228 - accuracy: 0.8734 - val_loss: 0.6020 - val_accuracy: 0.8152
Epoch 42/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3919 - accuracy: 0.8813 - val_loss: 0.6013 - val_accuracy: 0.8098
Epoch 43/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4043 - accuracy: 0.8769 - val_loss: 0.6000 - val_accuracy: 0.8158
Epoch 44/100
266/266 [==============================] - 17s 63ms/step - loss: 0.4058 - accuracy: 0.8826 - val_loss: 0.5997 - val_accuracy: 0.8118
Epoch 45/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3945 - accuracy: 0.8835 - val_loss: 0.5970 - val_accuracy: 0.8172
Epoch 46/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3904 - accuracy: 0.8886 - val_loss: 0.5951 - val_accuracy: 0.8172
Epoch 47/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3752 - accuracy: 0.8894 - val_loss: 0.5983 - val_accuracy: 0.8132
Epoch 48/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3933 - accuracy: 0.8885 - val_loss: 0.5985 - val_accuracy: 0.8118
Epoch 49/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3704 - accuracy: 0.8935 - val_loss: 0.5937 - val_accuracy: 0.8198
Epoch 50/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3771 - accuracy: 0.8855 - val_loss: 0.5942 - val_accuracy: 0.8198
Epoch 51/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3735 - accuracy: 0.8947 - val_loss: 0.5957 - val_accuracy: 0.8178
Epoch 52/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3650 - accuracy: 0.8949 - val_loss: 0.5947 - val_accuracy: 0.8211
Epoch 53/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3537 - accuracy: 0.9019 - val_loss: 0.5896 - val_accuracy: 0.8191
Epoch 54/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3553 - accuracy: 0.8966 - val_loss: 0.5902 - val_accuracy: 0.8165
Epoch 55/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3635 - accuracy: 0.8977 - val_loss: 0.5875 - val_accuracy: 0.8205
Epoch 56/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3509 - accuracy: 0.9000 - val_loss: 0.5913 - val_accuracy: 0.8205
Epoch 57/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3477 - accuracy: 0.9005 - val_loss: 0.5939 - val_accuracy: 0.8158
Epoch 58/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3355 - accuracy: 0.9041 - val_loss: 0.5917 - val_accuracy: 0.8145
Epoch 59/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3332 - accuracy: 0.9046 - val_loss: 0.5866 - val_accuracy: 0.8185
Epoch 60/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3374 - accuracy: 0.9026 - val_loss: 0.5899 - val_accuracy: 0.8178
Epoch 61/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3266 - accuracy: 0.9034 - val_loss: 0.5913 - val_accuracy: 0.8172
Epoch 62/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3283 - accuracy: 0.9062 - val_loss: 0.5873 - val_accuracy: 0.8191
Epoch 63/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3493 - accuracy: 0.9028 - val_loss: 0.5932 - val_accuracy: 0.8158
Epoch 64/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3159 - accuracy: 0.9077 - val_loss: 0.5911 - val_accuracy: 0.8172
Epoch 65/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3397 - accuracy: 0.9019 - val_loss: 0.5919 - val_accuracy: 0.8178
Epoch 66/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3151 - accuracy: 0.9139 - val_loss: 0.5919 - val_accuracy: 0.8165
Epoch 67/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3091 - accuracy: 0.9116 - val_loss: 0.5952 - val_accuracy: 0.8138
Epoch 68/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3076 - accuracy: 0.9137 - val_loss: 0.5899 - val_accuracy: 0.8185
Epoch 69/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2990 - accuracy: 0.9159 - val_loss: 0.5896 - val_accuracy: 0.8158
Epoch 70/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3269 - accuracy: 0.9070 - val_loss: 0.5886 - val_accuracy: 0.8205
Epoch 71/100
266/266 [==============================] - 17s 63ms/step - loss: 0.3106 - accuracy: 0.9110 - val_loss: 0.5872 - val_accuracy: 0.8211
Epoch 72/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3047 - accuracy: 0.9158 - val_loss: 0.5870 - val_accuracy: 0.8165
Epoch 73/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3025 - accuracy: 0.9122 - val_loss: 0.5904 - val_accuracy: 0.8198
Epoch 74/100
266/266 [==============================] - 17s 64ms/step - loss: 0.3051 - accuracy: 0.9185 - val_loss: 0.5877 - val_accuracy: 0.8211
Epoch 75/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2947 - accuracy: 0.9170 - val_loss: 0.5930 - val_accuracy: 0.8178
Epoch 76/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2959 - accuracy: 0.9198 - val_loss: 0.5861 - val_accuracy: 0.8231
Epoch 77/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2937 - accuracy: 0.9165 - val_loss: 0.5875 - val_accuracy: 0.8172
Epoch 78/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2930 - accuracy: 0.9178 - val_loss: 0.5903 - val_accuracy: 0.8158
Epoch 79/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2798 - accuracy: 0.9205 - val_loss: 0.5895 - val_accuracy: 0.8172
Epoch 80/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2883 - accuracy: 0.9208 - val_loss: 0.5944 - val_accuracy: 0.8185
Epoch 81/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2750 - accuracy: 0.9252 - val_loss: 0.5927 - val_accuracy: 0.8191
Epoch 82/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2795 - accuracy: 0.9273 - val_loss: 0.5910 - val_accuracy: 0.8158
Epoch 83/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2687 - accuracy: 0.9261 - val_loss: 0.5911 - val_accuracy: 0.8152
Epoch 84/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2814 - accuracy: 0.9206 - val_loss: 0.5933 - val_accuracy: 0.8152
Epoch 85/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2693 - accuracy: 0.9272 - val_loss: 0.5964 - val_accuracy: 0.8152
Epoch 86/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2739 - accuracy: 0.9264 - val_loss: 0.5929 - val_accuracy: 0.8172
Epoch 87/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2655 - accuracy: 0.9272 - val_loss: 0.5938 - val_accuracy: 0.8178
Epoch 88/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2721 - accuracy: 0.9237 - val_loss: 0.5992 - val_accuracy: 0.8165
Epoch 89/100
266/266 [==============================] - 17s 63ms/step - loss: 0.2726 - accuracy: 0.9251 - val_loss: 0.5916 - val_accuracy: 0.8211
Epoch 90/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2549 - accuracy: 0.9304 - val_loss: 0.5971 - val_accuracy: 0.8178
Epoch 91/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2568 - accuracy: 0.9295 - val_loss: 0.6031 - val_accuracy: 0.8158
Epoch 92/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2607 - accuracy: 0.9297 - val_loss: 0.5901 - val_accuracy: 0.8172
Epoch 93/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2615 - accuracy: 0.9271 - val_loss: 0.5975 - val_accuracy: 0.8158
Epoch 94/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2636 - accuracy: 0.9301 - val_loss: 0.5947 - val_accuracy: 0.8198
Epoch 95/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2640 - accuracy: 0.9283 - val_loss: 0.6013 - val_accuracy: 0.8145
Epoch 96/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2543 - accuracy: 0.9299 - val_loss: 0.5970 - val_accuracy: 0.8211
Epoch 97/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2620 - accuracy: 0.9267 - val_loss: 0.6000 - val_accuracy: 0.8145
Epoch 98/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2587 - accuracy: 0.9241 - val_loss: 0.5934 - val_accuracy: 0.8145
Epoch 99/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2478 - accuracy: 0.9293 - val_loss: 0.5993 - val_accuracy: 0.8185
Epoch 100/100
266/266 [==============================] - 17s 64ms/step - loss: 0.2464 - accuracy: 0.9342 - val_loss: 0.5989 - val_accuracy: 0.8185
In [ ]:
loss, accuracy = model_report(MobileNetV2_MODEL, MobileNetV2_MODEL_history, test_ds_res)
accuracies["MOBILENET_FEW"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.598
Accuracy: 82.044%
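With both transfer-learning settings evaluated, the `accuracies` dictionary collected in this notebook can be used to pick the better configuration. A minimal sketch, using illustrative copies of the test accuracies printed above:

```python
# Hypothetical snapshot of the accuracies dict built in this notebook.
accuracies = {"MOBILENET_NONE": 0.81944, "MOBILENET_FEW": 0.82044}

# Key with the highest recorded test accuracy.
best = max(accuracies, key=accuracies.get)
print(best)  # MOBILENET_FEW: unfreezing a few top conv layers edges out head-only training
```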
Training the entire network
In [ ]:
# transfer learning: MobileNet trained on ImageNet without the top layer

def init_MobileNetV2_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  mobilenetV2_model=tf.keras.applications.MobileNetV2(input_shape=(IMG_SIZE,IMG_SIZE,3), include_top=False, weights='imagenet')
  
  MobileNetV2_MODEL = mobilenetV2_model

  # unfreeze all layers of the base network
  MobileNetV2_MODEL.trainable = True
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([MobileNetV2_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
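Even with `trainable=True` on the whole base network, the summary reports 34,112 non-trainable parameters: these are the moving mean/variance statistics of the BatchNormalization layers, which are updated during the forward pass rather than by gradient descent. The split can be checked arithmetically (a sketch using the counts printed in the summary):

```python
# Counts as reported by model.summary() for the fully unfrozen model.
total = 2_283_604
trainable = 2_249_492
non_trainable = 34_112   # BatchNorm moving mean/variance

assert trainable + non_trainable == total
print(total - 2_257_984)  # 25620: the Dense head added on top of the base
```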
In [ ]:
MobileNetV2_MODEL = init_MobileNetV2_model(True)
MobileNetV2_MODEL_history = train_model(MobileNetV2_MODEL, train_ds_res, validation_ds_res)
Model: "sequential_8"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_5 (Dropout)          (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_5 ( (None, 1280)              0         
_________________________________________________________________
dense_12 (Dense)             (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 64s 227ms/step - loss: 1.7047 - accuracy: 0.5111 - val_loss: 2.7579 - val_accuracy: 0.3943
Epoch 2/100
266/266 [==============================] - 59s 222ms/step - loss: 0.3470 - accuracy: 0.8969 - val_loss: 2.8696 - val_accuracy: 0.4102
Epoch 3/100
266/266 [==============================] - 59s 222ms/step - loss: 0.1490 - accuracy: 0.9612 - val_loss: 3.0471 - val_accuracy: 0.3364
Epoch 4/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0793 - accuracy: 0.9820 - val_loss: 4.1843 - val_accuracy: 0.2832
Epoch 5/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0500 - accuracy: 0.9893 - val_loss: 3.4817 - val_accuracy: 0.3065
Epoch 6/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0328 - accuracy: 0.9932 - val_loss: 3.8670 - val_accuracy: 0.2959
Epoch 7/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0252 - accuracy: 0.9940 - val_loss: 2.9886 - val_accuracy: 0.3610
Epoch 8/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0325 - accuracy: 0.9891 - val_loss: 3.1674 - val_accuracy: 0.3517
Epoch 9/100
266/266 [==============================] - 59s 222ms/step - loss: 0.0220 - accuracy: 0.9945 - val_loss: 1.8102 - val_accuracy: 0.5938
Epoch 10/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0261 - accuracy: 0.9918 - val_loss: 2.1479 - val_accuracy: 0.5831
Epoch 11/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0246 - accuracy: 0.9939 - val_loss: 1.6997 - val_accuracy: 0.6370
Epoch 12/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0299 - accuracy: 0.9913 - val_loss: 1.1541 - val_accuracy: 0.7294
Epoch 13/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0284 - accuracy: 0.9923 - val_loss: 0.9851 - val_accuracy: 0.7852
Epoch 14/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0301 - accuracy: 0.9897 - val_loss: 1.1159 - val_accuracy: 0.7633
Epoch 15/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0236 - accuracy: 0.9934 - val_loss: 1.0914 - val_accuracy: 0.7693
Epoch 16/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0224 - accuracy: 0.9946 - val_loss: 1.0156 - val_accuracy: 0.7999
Epoch 17/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0168 - accuracy: 0.9934 - val_loss: 0.9693 - val_accuracy: 0.8019
Epoch 18/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0201 - accuracy: 0.9920 - val_loss: 1.1548 - val_accuracy: 0.7932
Epoch 19/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0125 - accuracy: 0.9969 - val_loss: 0.7539 - val_accuracy: 0.8524
Epoch 20/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0174 - accuracy: 0.9946 - val_loss: 0.7734 - val_accuracy: 0.8464
Epoch 21/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0183 - accuracy: 0.9946 - val_loss: 0.9688 - val_accuracy: 0.8172
Epoch 22/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0159 - accuracy: 0.9947 - val_loss: 0.9582 - val_accuracy: 0.8238
Epoch 23/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0244 - accuracy: 0.9916 - val_loss: 0.8818 - val_accuracy: 0.8457
Epoch 24/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0253 - accuracy: 0.9934 - val_loss: 0.9232 - val_accuracy: 0.8245
Epoch 25/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0197 - accuracy: 0.9936 - val_loss: 0.6872 - val_accuracy: 0.8604
Epoch 26/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0133 - accuracy: 0.9953 - val_loss: 0.6385 - val_accuracy: 0.8610
Epoch 27/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0068 - accuracy: 0.9977 - val_loss: 0.7203 - val_accuracy: 0.8597
Epoch 28/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0142 - accuracy: 0.9953 - val_loss: 0.7928 - val_accuracy: 0.8371
Epoch 29/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0156 - accuracy: 0.9955 - val_loss: 0.8189 - val_accuracy: 0.8424
Epoch 30/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0137 - accuracy: 0.9953 - val_loss: 0.7457 - val_accuracy: 0.8544
Epoch 31/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0216 - accuracy: 0.9921 - val_loss: 0.8158 - val_accuracy: 0.8497
Epoch 32/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0180 - accuracy: 0.9949 - val_loss: 1.1578 - val_accuracy: 0.7806
Epoch 33/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0250 - accuracy: 0.9934 - val_loss: 0.8768 - val_accuracy: 0.8258
Epoch 34/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0142 - accuracy: 0.9940 - val_loss: 0.8907 - val_accuracy: 0.8451
Epoch 35/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0107 - accuracy: 0.9965 - val_loss: 0.9811 - val_accuracy: 0.8265
Epoch 36/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0098 - accuracy: 0.9957 - val_loss: 0.7951 - val_accuracy: 0.8524
Epoch 37/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0074 - accuracy: 0.9984 - val_loss: 0.7985 - val_accuracy: 0.8504
Epoch 38/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0151 - accuracy: 0.9946 - val_loss: 0.9172 - val_accuracy: 0.8444
Epoch 39/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0250 - accuracy: 0.9918 - val_loss: 1.0921 - val_accuracy: 0.8185
Epoch 40/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0141 - accuracy: 0.9946 - val_loss: 0.9426 - val_accuracy: 0.8424
Epoch 41/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0135 - accuracy: 0.9956 - val_loss: 0.9536 - val_accuracy: 0.8378
Epoch 42/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0164 - accuracy: 0.9939 - val_loss: 1.0516 - val_accuracy: 0.8311
Epoch 43/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0249 - accuracy: 0.9904 - val_loss: 0.9268 - val_accuracy: 0.8265
Epoch 44/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0148 - accuracy: 0.9945 - val_loss: 0.6744 - val_accuracy: 0.8723
Epoch 45/100
266/266 [==============================] - 59s 222ms/step - loss: 0.0066 - accuracy: 0.9980 - val_loss: 0.6839 - val_accuracy: 0.8803
Epoch 46/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0064 - accuracy: 0.9976 - val_loss: 0.6675 - val_accuracy: 0.8797
Epoch 47/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0090 - accuracy: 0.9969 - val_loss: 0.7117 - val_accuracy: 0.8690
Epoch 48/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0165 - accuracy: 0.9941 - val_loss: 0.8548 - val_accuracy: 0.8537
Epoch 49/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0095 - accuracy: 0.9969 - val_loss: 0.8259 - val_accuracy: 0.8684
Epoch 50/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0057 - accuracy: 0.9988 - val_loss: 0.7899 - val_accuracy: 0.8590
Epoch 51/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0077 - accuracy: 0.9971 - val_loss: 0.8157 - val_accuracy: 0.8750
Epoch 52/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0113 - accuracy: 0.9971 - val_loss: 0.8184 - val_accuracy: 0.8570
Epoch 53/100
266/266 [==============================] - 59s 224ms/step - loss: 0.0085 - accuracy: 0.9974 - val_loss: 0.8066 - val_accuracy: 0.8677
Epoch 54/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0135 - accuracy: 0.9959 - val_loss: 0.8892 - val_accuracy: 0.8331
Epoch 55/100
266/266 [==============================] - 61s 229ms/step - loss: 0.0208 - accuracy: 0.9935 - val_loss: 0.9406 - val_accuracy: 0.8457
Epoch 56/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0173 - accuracy: 0.9948 - val_loss: 0.7935 - val_accuracy: 0.8584
Epoch 57/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0166 - accuracy: 0.9945 - val_loss: 0.8255 - val_accuracy: 0.8504
Epoch 58/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0059 - accuracy: 0.9981 - val_loss: 0.8977 - val_accuracy: 0.8431
Epoch 59/100
266/266 [==============================] - 59s 224ms/step - loss: 0.0132 - accuracy: 0.9968 - val_loss: 0.8692 - val_accuracy: 0.8444
Epoch 60/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0101 - accuracy: 0.9964 - val_loss: 0.8575 - val_accuracy: 0.8424
Epoch 61/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0169 - accuracy: 0.9957 - val_loss: 0.8070 - val_accuracy: 0.8664
Epoch 62/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0184 - accuracy: 0.9938 - val_loss: 0.9662 - val_accuracy: 0.8457
Epoch 63/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0086 - accuracy: 0.9978 - val_loss: 0.7344 - val_accuracy: 0.8684
Epoch 64/100
266/266 [==============================] - 61s 229ms/step - loss: 0.0115 - accuracy: 0.9959 - val_loss: 0.8391 - val_accuracy: 0.8577
Epoch 65/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0082 - accuracy: 0.9974 - val_loss: 0.8909 - val_accuracy: 0.8504
Epoch 66/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0104 - accuracy: 0.9971 - val_loss: 0.8593 - val_accuracy: 0.8464
Epoch 67/100
266/266 [==============================] - 59s 222ms/step - loss: 0.0110 - accuracy: 0.9970 - val_loss: 1.1033 - val_accuracy: 0.8132
Epoch 68/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0138 - accuracy: 0.9954 - val_loss: 0.9913 - val_accuracy: 0.8331
Epoch 69/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0067 - accuracy: 0.9982 - val_loss: 0.9181 - val_accuracy: 0.8497
Epoch 70/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0050 - accuracy: 0.9984 - val_loss: 0.9148 - val_accuracy: 0.8584
Epoch 71/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0160 - accuracy: 0.9962 - val_loss: 0.9638 - val_accuracy: 0.8318
Epoch 72/100
266/266 [==============================] - 61s 229ms/step - loss: 0.0153 - accuracy: 0.9953 - val_loss: 0.9112 - val_accuracy: 0.8444
Epoch 73/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0242 - accuracy: 0.9924 - val_loss: 0.7204 - val_accuracy: 0.8717
Epoch 74/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0087 - accuracy: 0.9974 - val_loss: 0.8750 - val_accuracy: 0.8637
Epoch 75/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0070 - accuracy: 0.9973 - val_loss: 0.7527 - val_accuracy: 0.8750
Epoch 76/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0052 - accuracy: 0.9980 - val_loss: 0.7587 - val_accuracy: 0.8590
Epoch 77/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0083 - accuracy: 0.9982 - val_loss: 0.7172 - val_accuracy: 0.8816
Epoch 78/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0070 - accuracy: 0.9973 - val_loss: 0.8966 - val_accuracy: 0.8511
Epoch 79/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0218 - accuracy: 0.9916 - val_loss: 0.7139 - val_accuracy: 0.8664
Epoch 80/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0119 - accuracy: 0.9966 - val_loss: 0.7238 - val_accuracy: 0.8670
Epoch 81/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0068 - accuracy: 0.9977 - val_loss: 0.7293 - val_accuracy: 0.8684
Epoch 82/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0084 - accuracy: 0.9969 - val_loss: 0.8119 - val_accuracy: 0.8644
Epoch 83/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0036 - accuracy: 0.9993 - val_loss: 0.8882 - val_accuracy: 0.8650
Epoch 84/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0039 - accuracy: 0.9989 - val_loss: 0.7687 - val_accuracy: 0.8743
Epoch 85/100
266/266 [==============================] - 61s 229ms/step - loss: 0.0026 - accuracy: 0.9993 - val_loss: 0.7611 - val_accuracy: 0.8743
Epoch 86/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0082 - accuracy: 0.9975 - val_loss: 0.8711 - val_accuracy: 0.8590
Epoch 87/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0040 - accuracy: 0.9987 - val_loss: 0.8576 - val_accuracy: 0.8584
Epoch 88/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0044 - accuracy: 0.9982 - val_loss: 0.7153 - val_accuracy: 0.8803
Epoch 89/100
266/266 [==============================] - 61s 231ms/step - loss: 0.0151 - accuracy: 0.9955 - val_loss: 0.9009 - val_accuracy: 0.8351
Epoch 90/100
266/266 [==============================] - 61s 229ms/step - loss: 0.0161 - accuracy: 0.9956 - val_loss: 0.7495 - val_accuracy: 0.8657
Epoch 91/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0056 - accuracy: 0.9979 - val_loss: 0.7530 - val_accuracy: 0.8670
Epoch 92/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0111 - accuracy: 0.9962 - val_loss: 0.9941 - val_accuracy: 0.8457
Epoch 93/100
266/266 [==============================] - 61s 228ms/step - loss: 0.0089 - accuracy: 0.9965 - val_loss: 0.7880 - val_accuracy: 0.8624
Epoch 94/100
266/266 [==============================] - 60s 227ms/step - loss: 0.0070 - accuracy: 0.9978 - val_loss: 0.6595 - val_accuracy: 0.8757
Epoch 95/100
266/266 [==============================] - 60s 224ms/step - loss: 0.0075 - accuracy: 0.9977 - val_loss: 0.6407 - val_accuracy: 0.8783
Epoch 96/100
266/266 [==============================] - 59s 224ms/step - loss: 0.0075 - accuracy: 0.9982 - val_loss: 0.8556 - val_accuracy: 0.8451
Epoch 97/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0047 - accuracy: 0.9988 - val_loss: 0.7254 - val_accuracy: 0.8770
Epoch 98/100
266/266 [==============================] - 60s 226ms/step - loss: 0.0124 - accuracy: 0.9958 - val_loss: 0.5689 - val_accuracy: 0.8777
Epoch 99/100
266/266 [==============================] - 59s 223ms/step - loss: 0.0040 - accuracy: 0.9992 - val_loss: 0.7228 - val_accuracy: 0.8690
Epoch 100/100
266/266 [==============================] - 60s 225ms/step - loss: 0.0113 - accuracy: 0.9959 - val_loss: 0.8524 - val_accuracy: 0.8564
In [ ]:
loss, accuracy = model_report(MobileNetV2_MODEL, MobileNetV2_MODEL_history, test_ds_res)
losses["MOBILENET_ALL"] = loss
accuracies["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.799
Accuracy: 85.367%

DenseNet

Finally, we examine DenseNet. In this model, each layer receives additional inputs from all preceding layers and, in turn, passes its own feature maps to all subsequent layers. Each layer therefore receives a "collective knowledge" from all the layers before it. This allows the network to have a simpler structure and, for example, a smaller number of channels per layer. A simple DenseNet block is shown in the following image:

dense
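The concatenation pattern described above can be sketched with a toy dense block; this is a minimal illustration of the idea, not the actual DenseNet121 architecture used below (the layer count, growth rate, and input shape here are arbitrary choices for the example):

```python
import tensorflow as tf
from tensorflow.keras import layers

def dense_block(x, num_layers=4, growth_rate=12):
    # each conv layer sees the concatenation of ALL previous feature maps
    for _ in range(num_layers):
        y = layers.BatchNormalization()(x)
        y = layers.ReLU()(y)
        y = layers.Conv2D(growth_rate, 3, padding="same")(y)
        x = layers.Concatenate()([x, y])  # pass this layer's feature maps forward
    return x

inputs = tf.keras.Input(shape=(32, 32, 16))
outputs = dense_block(inputs)
model = tf.keras.Model(inputs, outputs)
# channels grow by growth_rate per layer: 16 + 4 * 12 = 64
```

Because each layer only adds `growth_rate` new channels, the per-layer channel count stays small even though every layer has access to all earlier features.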
Training only the classification head
In [ ]:
# transfer learning: DenseNet trained on ImageNet without the top layer

def init_DENSENET_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  densenet_model=tf.keras.applications.densenet.DenseNet121(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  DENSENET_MODEL = densenet_model

  # freeze all conv layers; only the classification head will be trained
  DENSENET_MODEL.trainable = False

  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([DENSENET_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
DENSENET_MODEL = init_DENSENET_model(True)
DENSENET_MODEL_history = train_model(DENSENET_MODEL)
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "sequential_9"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_6 (Dropout)          (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_6 ( (None, 1024)              0         
_________________________________________________________________
dense_13 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 20,500
Non-trainable params: 7,037,504
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 10s 17ms/step - loss: 4.3131 - accuracy: 0.0646 - val_loss: 2.8074 - val_accuracy: 0.1742
Epoch 2/100
266/266 [==============================] - 3s 11ms/step - loss: 3.3766 - accuracy: 0.1448 - val_loss: 2.3701 - val_accuracy: 0.2879
Epoch 3/100
266/266 [==============================] - 3s 11ms/step - loss: 2.8992 - accuracy: 0.2019 - val_loss: 2.1102 - val_accuracy: 0.3557
Epoch 4/100
266/266 [==============================] - 3s 11ms/step - loss: 2.5720 - accuracy: 0.2731 - val_loss: 1.9477 - val_accuracy: 0.4096
Epoch 5/100
266/266 [==============================] - 3s 11ms/step - loss: 2.3820 - accuracy: 0.3105 - val_loss: 1.8350 - val_accuracy: 0.4588
Epoch 6/100
266/266 [==============================] - 3s 11ms/step - loss: 2.2267 - accuracy: 0.3519 - val_loss: 1.7531 - val_accuracy: 0.4867
Epoch 7/100
266/266 [==============================] - 3s 11ms/step - loss: 2.0775 - accuracy: 0.3767 - val_loss: 1.6893 - val_accuracy: 0.5033
Epoch 8/100
266/266 [==============================] - 3s 11ms/step - loss: 1.9981 - accuracy: 0.3986 - val_loss: 1.6360 - val_accuracy: 0.5153
Epoch 9/100
266/266 [==============================] - 3s 11ms/step - loss: 1.9509 - accuracy: 0.4157 - val_loss: 1.5927 - val_accuracy: 0.5180
Epoch 10/100
266/266 [==============================] - 3s 11ms/step - loss: 1.8338 - accuracy: 0.4427 - val_loss: 1.5577 - val_accuracy: 0.5432
Epoch 11/100
266/266 [==============================] - 3s 11ms/step - loss: 1.7955 - accuracy: 0.4610 - val_loss: 1.5318 - val_accuracy: 0.5439
Epoch 12/100
266/266 [==============================] - 3s 11ms/step - loss: 1.7925 - accuracy: 0.4660 - val_loss: 1.5035 - val_accuracy: 0.5578
Epoch 13/100
266/266 [==============================] - 3s 11ms/step - loss: 1.7180 - accuracy: 0.4762 - val_loss: 1.4812 - val_accuracy: 0.5585
Epoch 14/100
266/266 [==============================] - 3s 11ms/step - loss: 1.6743 - accuracy: 0.4943 - val_loss: 1.4622 - val_accuracy: 0.5652
Epoch 15/100
266/266 [==============================] - 3s 11ms/step - loss: 1.6204 - accuracy: 0.5046 - val_loss: 1.4477 - val_accuracy: 0.5765
Epoch 16/100
266/266 [==============================] - 3s 11ms/step - loss: 1.6030 - accuracy: 0.5173 - val_loss: 1.4291 - val_accuracy: 0.5778
Epoch 17/100
266/266 [==============================] - 3s 11ms/step - loss: 1.6329 - accuracy: 0.5051 - val_loss: 1.4119 - val_accuracy: 0.5878
Epoch 18/100
266/266 [==============================] - 3s 11ms/step - loss: 1.5593 - accuracy: 0.5225 - val_loss: 1.3960 - val_accuracy: 0.5891
Epoch 19/100
266/266 [==============================] - 3s 11ms/step - loss: 1.5581 - accuracy: 0.5175 - val_loss: 1.3845 - val_accuracy: 0.5957
Epoch 20/100
266/266 [==============================] - 3s 11ms/step - loss: 1.5026 - accuracy: 0.5354 - val_loss: 1.3811 - val_accuracy: 0.6017
Epoch 21/100
266/266 [==============================] - 3s 11ms/step - loss: 1.5089 - accuracy: 0.5339 - val_loss: 1.3679 - val_accuracy: 0.5984
Epoch 22/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4960 - accuracy: 0.5503 - val_loss: 1.3617 - val_accuracy: 0.6011
Epoch 23/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4750 - accuracy: 0.5462 - val_loss: 1.3531 - val_accuracy: 0.6024
Epoch 24/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4819 - accuracy: 0.5335 - val_loss: 1.3403 - val_accuracy: 0.6057
Epoch 25/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4441 - accuracy: 0.5594 - val_loss: 1.3370 - val_accuracy: 0.6070
Epoch 26/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4324 - accuracy: 0.5562 - val_loss: 1.3320 - val_accuracy: 0.6104
Epoch 27/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3882 - accuracy: 0.5724 - val_loss: 1.3262 - val_accuracy: 0.6144
Epoch 28/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4263 - accuracy: 0.5613 - val_loss: 1.3169 - val_accuracy: 0.6104
Epoch 29/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4008 - accuracy: 0.5683 - val_loss: 1.3189 - val_accuracy: 0.6097
Epoch 30/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4207 - accuracy: 0.5661 - val_loss: 1.3085 - val_accuracy: 0.6110
Epoch 31/100
266/266 [==============================] - 3s 11ms/step - loss: 1.4022 - accuracy: 0.5674 - val_loss: 1.3015 - val_accuracy: 0.6210
Epoch 32/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3689 - accuracy: 0.5773 - val_loss: 1.3017 - val_accuracy: 0.6150
Epoch 33/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3609 - accuracy: 0.5838 - val_loss: 1.2993 - val_accuracy: 0.6184
Epoch 34/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3525 - accuracy: 0.5808 - val_loss: 1.2961 - val_accuracy: 0.6164
Epoch 35/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3486 - accuracy: 0.5826 - val_loss: 1.2853 - val_accuracy: 0.6230
Epoch 36/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3346 - accuracy: 0.5864 - val_loss: 1.2865 - val_accuracy: 0.6144
Epoch 37/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3470 - accuracy: 0.5834 - val_loss: 1.2853 - val_accuracy: 0.6237
Epoch 38/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3280 - accuracy: 0.5949 - val_loss: 1.2787 - val_accuracy: 0.6257
Epoch 39/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3511 - accuracy: 0.5817 - val_loss: 1.2767 - val_accuracy: 0.6190
Epoch 40/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3279 - accuracy: 0.6018 - val_loss: 1.2800 - val_accuracy: 0.6157
Epoch 41/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3193 - accuracy: 0.5945 - val_loss: 1.2800 - val_accuracy: 0.6217
Epoch 42/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3094 - accuracy: 0.5932 - val_loss: 1.2761 - val_accuracy: 0.6184
Epoch 43/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2994 - accuracy: 0.5998 - val_loss: 1.2727 - val_accuracy: 0.6157
Epoch 44/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2678 - accuracy: 0.6105 - val_loss: 1.2683 - val_accuracy: 0.6230
Epoch 45/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3063 - accuracy: 0.6004 - val_loss: 1.2665 - val_accuracy: 0.6223
Epoch 46/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3250 - accuracy: 0.5963 - val_loss: 1.2662 - val_accuracy: 0.6250
Epoch 47/100
266/266 [==============================] - 3s 12ms/step - loss: 1.3240 - accuracy: 0.5948 - val_loss: 1.2568 - val_accuracy: 0.6277
Epoch 48/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2981 - accuracy: 0.5914 - val_loss: 1.2601 - val_accuracy: 0.6270
Epoch 49/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2781 - accuracy: 0.6003 - val_loss: 1.2559 - val_accuracy: 0.6270
Epoch 50/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3070 - accuracy: 0.5981 - val_loss: 1.2567 - val_accuracy: 0.6230
Epoch 51/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2732 - accuracy: 0.6079 - val_loss: 1.2513 - val_accuracy: 0.6263
Epoch 52/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2810 - accuracy: 0.6062 - val_loss: 1.2537 - val_accuracy: 0.6257
Epoch 53/100
266/266 [==============================] - 3s 11ms/step - loss: 1.3142 - accuracy: 0.5982 - val_loss: 1.2570 - val_accuracy: 0.6217
Epoch 54/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2742 - accuracy: 0.6025 - val_loss: 1.2453 - val_accuracy: 0.6237
Epoch 55/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2971 - accuracy: 0.6018 - val_loss: 1.2476 - val_accuracy: 0.6250
Epoch 56/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2931 - accuracy: 0.6035 - val_loss: 1.2425 - val_accuracy: 0.6283
Epoch 57/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2749 - accuracy: 0.6068 - val_loss: 1.2438 - val_accuracy: 0.6263
Epoch 58/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2482 - accuracy: 0.6119 - val_loss: 1.2434 - val_accuracy: 0.6270
Epoch 59/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2696 - accuracy: 0.6074 - val_loss: 1.2459 - val_accuracy: 0.6303
Epoch 60/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2756 - accuracy: 0.6065 - val_loss: 1.2413 - val_accuracy: 0.6316
Epoch 61/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2567 - accuracy: 0.6046 - val_loss: 1.2397 - val_accuracy: 0.6283
Epoch 62/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2711 - accuracy: 0.6009 - val_loss: 1.2412 - val_accuracy: 0.6316
Epoch 63/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2500 - accuracy: 0.6204 - val_loss: 1.2376 - val_accuracy: 0.6356
Epoch 64/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2741 - accuracy: 0.6101 - val_loss: 1.2409 - val_accuracy: 0.6350
Epoch 65/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2502 - accuracy: 0.6100 - val_loss: 1.2325 - val_accuracy: 0.6390
Epoch 66/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2403 - accuracy: 0.6098 - val_loss: 1.2387 - val_accuracy: 0.6330
Epoch 67/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2445 - accuracy: 0.6112 - val_loss: 1.2408 - val_accuracy: 0.6316
Epoch 68/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2586 - accuracy: 0.6112 - val_loss: 1.2309 - val_accuracy: 0.6336
Epoch 69/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2524 - accuracy: 0.6031 - val_loss: 1.2338 - val_accuracy: 0.6383
Epoch 70/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2393 - accuracy: 0.6123 - val_loss: 1.2369 - val_accuracy: 0.6323
Epoch 71/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2565 - accuracy: 0.6085 - val_loss: 1.2375 - val_accuracy: 0.6257
Epoch 72/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2364 - accuracy: 0.6206 - val_loss: 1.2326 - val_accuracy: 0.6297
Epoch 73/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2344 - accuracy: 0.6114 - val_loss: 1.2375 - val_accuracy: 0.6316
Epoch 74/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2282 - accuracy: 0.6149 - val_loss: 1.2319 - val_accuracy: 0.6363
Epoch 75/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2493 - accuracy: 0.6169 - val_loss: 1.2375 - val_accuracy: 0.6263
Epoch 76/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2470 - accuracy: 0.6123 - val_loss: 1.2324 - val_accuracy: 0.6310
Epoch 77/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2360 - accuracy: 0.6185 - val_loss: 1.2294 - val_accuracy: 0.6403
Epoch 78/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2377 - accuracy: 0.6094 - val_loss: 1.2319 - val_accuracy: 0.6316
Epoch 79/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2413 - accuracy: 0.6107 - val_loss: 1.2296 - val_accuracy: 0.6343
Epoch 80/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2540 - accuracy: 0.6020 - val_loss: 1.2296 - val_accuracy: 0.6383
Epoch 81/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2309 - accuracy: 0.6201 - val_loss: 1.2331 - val_accuracy: 0.6283
Epoch 82/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2364 - accuracy: 0.6145 - val_loss: 1.2273 - val_accuracy: 0.6297
Epoch 83/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2281 - accuracy: 0.6262 - val_loss: 1.2315 - val_accuracy: 0.6323
Epoch 84/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2273 - accuracy: 0.6148 - val_loss: 1.2260 - val_accuracy: 0.6297
Epoch 85/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2273 - accuracy: 0.6199 - val_loss: 1.2292 - val_accuracy: 0.6350
Epoch 86/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2607 - accuracy: 0.6108 - val_loss: 1.2270 - val_accuracy: 0.6336
Epoch 87/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2494 - accuracy: 0.6139 - val_loss: 1.2234 - val_accuracy: 0.6330
Epoch 88/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2374 - accuracy: 0.6181 - val_loss: 1.2269 - val_accuracy: 0.6350
Epoch 89/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2367 - accuracy: 0.6069 - val_loss: 1.2264 - val_accuracy: 0.6330
Epoch 90/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2422 - accuracy: 0.6106 - val_loss: 1.2245 - val_accuracy: 0.6277
Epoch 91/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2164 - accuracy: 0.6175 - val_loss: 1.2274 - val_accuracy: 0.6363
Epoch 92/100
266/266 [==============================] - 3s 12ms/step - loss: 1.2509 - accuracy: 0.6099 - val_loss: 1.2263 - val_accuracy: 0.6356
Epoch 93/100
266/266 [==============================] - 3s 12ms/step - loss: 1.2048 - accuracy: 0.6201 - val_loss: 1.2305 - val_accuracy: 0.6316
Epoch 94/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2066 - accuracy: 0.6256 - val_loss: 1.2224 - val_accuracy: 0.6370
Epoch 95/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2068 - accuracy: 0.6270 - val_loss: 1.2206 - val_accuracy: 0.6343
Epoch 96/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2214 - accuracy: 0.6236 - val_loss: 1.2234 - val_accuracy: 0.6330
Epoch 97/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2428 - accuracy: 0.6061 - val_loss: 1.2238 - val_accuracy: 0.6363
Epoch 98/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2369 - accuracy: 0.6152 - val_loss: 1.2173 - val_accuracy: 0.6390
Epoch 99/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2328 - accuracy: 0.6176 - val_loss: 1.2200 - val_accuracy: 0.6350
Epoch 100/100
266/266 [==============================] - 3s 11ms/step - loss: 1.2559 - accuracy: 0.6131 - val_loss: 1.2216 - val_accuracy: 0.6403
In [ ]:
loss, accuracy = model_report(DENSENET_MODEL, DENSENET_MODEL_history)
losses["DENSENET_NONE"] = loss
accuracies["DENSENET_NONE"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.298
Accuracy: 61.756%
Training the classification head together with some of the convolutional layers closest to it
In [ ]:
# transfer learning: DenseNet trained on ImageNet without the top layer,
# this time fine-tuning the last convolutional layers as well

def init_DENSENET_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  densenet_model=tf.keras.applications.densenet.DenseNet121(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  DENSENET_MODEL = densenet_model

  # freeze the first 313 layers; fine-tune the remaining ones (closest to the head)
  for layer in DENSENET_MODEL.layers[:313]:
    layer.trainable = False
  for layer in DENSENET_MODEL.layers[313:]:
    layer.trainable = True

  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM,activation='softmax')
  model = tf.keras.Sequential([DENSENET_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
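The freeze boundary (index 313) can be chosen by inspecting the backbone's layer list and finding where the last dense block begins. A minimal sketch of that inspection, using `weights=None` so the architecture can be examined without downloading the ImageNet weights:

```python
import tensorflow as tf

# build the same backbone; weights=None is enough for inspecting the structure
base = tf.keras.applications.densenet.DenseNet121(
    input_shape=(32, 32, 3), include_top=False, weights=None)

# print index/name pairs around the candidate boundary to see which block starts there
for i, layer in enumerate(base.layers):
    if 310 <= i <= 316:
        print(i, layer.name)
```

Any index could be used in place of 313; deeper boundaries leave fewer trainable parameters, trading fine-tuning capacity for less risk of overfitting on the small 32x32 inputs.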
In [ ]:
DENSENET_MODEL = init_DENSENET_model(True)
DENSENET_MODEL_history = train_model(DENSENET_MODEL)
Model: "sequential_20"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_20 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_20  (None, 1024)              0         
_________________________________________________________________
dense_20 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 2,180,628
Non-trainable params: 4,877,376
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 20s 24ms/step - loss: 4.1380 - accuracy: 0.1036 - val_loss: 2.1078 - val_accuracy: 0.4056
Epoch 2/100
266/266 [==============================] - 5s 18ms/step - loss: 2.4761 - accuracy: 0.3425 - val_loss: 1.7077 - val_accuracy: 0.5080
Epoch 3/100
266/266 [==============================] - 5s 18ms/step - loss: 2.0334 - accuracy: 0.4407 - val_loss: 1.5528 - val_accuracy: 0.5379
Epoch 4/100
266/266 [==============================] - 5s 18ms/step - loss: 1.7298 - accuracy: 0.4994 - val_loss: 1.4639 - val_accuracy: 0.5658
Epoch 5/100
266/266 [==============================] - 5s 18ms/step - loss: 1.5500 - accuracy: 0.5471 - val_loss: 1.4091 - val_accuracy: 0.5818
Epoch 6/100
266/266 [==============================] - 5s 18ms/step - loss: 1.3839 - accuracy: 0.5837 - val_loss: 1.3540 - val_accuracy: 0.6031
Epoch 7/100
266/266 [==============================] - 5s 18ms/step - loss: 1.2574 - accuracy: 0.6166 - val_loss: 1.3197 - val_accuracy: 0.6084
Epoch 8/100
266/266 [==============================] - 5s 18ms/step - loss: 1.1882 - accuracy: 0.6336 - val_loss: 1.2958 - val_accuracy: 0.6190
Epoch 9/100
266/266 [==============================] - 5s 18ms/step - loss: 1.1062 - accuracy: 0.6562 - val_loss: 1.2818 - val_accuracy: 0.6230
Epoch 10/100
266/266 [==============================] - 5s 18ms/step - loss: 1.0218 - accuracy: 0.6784 - val_loss: 1.2688 - val_accuracy: 0.6270
Epoch 11/100
266/266 [==============================] - 5s 18ms/step - loss: 0.9345 - accuracy: 0.7077 - val_loss: 1.2462 - val_accuracy: 0.6376
Epoch 12/100
266/266 [==============================] - 5s 18ms/step - loss: 0.8221 - accuracy: 0.7417 - val_loss: 1.2482 - val_accuracy: 0.6396
Epoch 13/100
266/266 [==============================] - 5s 18ms/step - loss: 0.7946 - accuracy: 0.7483 - val_loss: 1.2388 - val_accuracy: 0.6436
Epoch 14/100
266/266 [==============================] - 5s 18ms/step - loss: 0.7442 - accuracy: 0.7591 - val_loss: 1.2360 - val_accuracy: 0.6476
Epoch 15/100
266/266 [==============================] - 5s 17ms/step - loss: 0.6640 - accuracy: 0.7897 - val_loss: 1.2230 - val_accuracy: 0.6503
Epoch 16/100
266/266 [==============================] - 5s 18ms/step - loss: 0.6226 - accuracy: 0.8020 - val_loss: 1.2343 - val_accuracy: 0.6516
Epoch 17/100
266/266 [==============================] - 5s 18ms/step - loss: 0.6012 - accuracy: 0.8162 - val_loss: 1.2323 - val_accuracy: 0.6443
Epoch 18/100
266/266 [==============================] - 5s 18ms/step - loss: 0.5386 - accuracy: 0.8283 - val_loss: 1.2288 - val_accuracy: 0.6536
Epoch 19/100
266/266 [==============================] - 5s 18ms/step - loss: 0.5281 - accuracy: 0.8300 - val_loss: 1.2316 - val_accuracy: 0.6556
Epoch 20/100
266/266 [==============================] - 5s 18ms/step - loss: 0.4698 - accuracy: 0.8504 - val_loss: 1.2444 - val_accuracy: 0.6496
Epoch 21/100
266/266 [==============================] - 5s 18ms/step - loss: 0.4294 - accuracy: 0.8657 - val_loss: 1.2444 - val_accuracy: 0.6562
Epoch 22/100
266/266 [==============================] - 5s 18ms/step - loss: 0.3932 - accuracy: 0.8764 - val_loss: 1.2536 - val_accuracy: 0.6562
Epoch 23/100
266/266 [==============================] - 5s 18ms/step - loss: 0.3723 - accuracy: 0.8887 - val_loss: 1.2696 - val_accuracy: 0.6622
Epoch 24/100
266/266 [==============================] - 5s 17ms/step - loss: 0.3686 - accuracy: 0.8843 - val_loss: 1.2722 - val_accuracy: 0.6676
Epoch 25/100
266/266 [==============================] - 5s 17ms/step - loss: 0.3265 - accuracy: 0.9009 - val_loss: 1.2819 - val_accuracy: 0.6602
Epoch 26/100
266/266 [==============================] - 5s 18ms/step - loss: 0.2945 - accuracy: 0.9093 - val_loss: 1.3039 - val_accuracy: 0.6562
Epoch 27/100
266/266 [==============================] - 5s 18ms/step - loss: 0.2845 - accuracy: 0.9171 - val_loss: 1.3185 - val_accuracy: 0.6569
Epoch 28/100
266/266 [==============================] - 5s 18ms/step - loss: 0.2519 - accuracy: 0.9260 - val_loss: 1.3378 - val_accuracy: 0.6523
Epoch 29/100
266/266 [==============================] - 5s 18ms/step - loss: 0.2405 - accuracy: 0.9281 - val_loss: 1.3395 - val_accuracy: 0.6529
Epoch 30/100
266/266 [==============================] - 5s 18ms/step - loss: 0.2169 - accuracy: 0.9375 - val_loss: 1.3599 - val_accuracy: 0.6569
Epoch 31/100
266/266 [==============================] - 5s 18ms/step - loss: 0.2212 - accuracy: 0.9323 - val_loss: 1.3652 - val_accuracy: 0.6562
Epoch 32/100
266/266 [==============================] - 5s 18ms/step - loss: 0.1986 - accuracy: 0.9447 - val_loss: 1.3780 - val_accuracy: 0.6609
Epoch 33/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1965 - accuracy: 0.9404 - val_loss: 1.3775 - val_accuracy: 0.6636
Epoch 34/100
266/266 [==============================] - 5s 18ms/step - loss: 0.1806 - accuracy: 0.9443 - val_loss: 1.3869 - val_accuracy: 0.6695
Epoch 35/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1678 - accuracy: 0.9521 - val_loss: 1.3959 - val_accuracy: 0.6616
Epoch 36/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1457 - accuracy: 0.9573 - val_loss: 1.4195 - val_accuracy: 0.6609
Epoch 37/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1392 - accuracy: 0.9609 - val_loss: 1.4338 - val_accuracy: 0.6562
Epoch 38/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1282 - accuracy: 0.9630 - val_loss: 1.4444 - val_accuracy: 0.6642
Epoch 39/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1300 - accuracy: 0.9651 - val_loss: 1.4250 - val_accuracy: 0.6636
Epoch 40/100
266/266 [==============================] - 4s 17ms/step - loss: 0.1223 - accuracy: 0.9639 - val_loss: 1.4606 - val_accuracy: 0.6636
Epoch 41/100
266/266 [==============================] - 4s 17ms/step - loss: 0.1155 - accuracy: 0.9666 - val_loss: 1.4749 - val_accuracy: 0.6596
Epoch 42/100
266/266 [==============================] - 5s 17ms/step - loss: 0.1018 - accuracy: 0.9733 - val_loss: 1.4784 - val_accuracy: 0.6636
Epoch 43/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0975 - accuracy: 0.9739 - val_loss: 1.4783 - val_accuracy: 0.6642
Epoch 44/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0861 - accuracy: 0.9777 - val_loss: 1.4760 - val_accuracy: 0.6662
Epoch 45/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0941 - accuracy: 0.9741 - val_loss: 1.4815 - val_accuracy: 0.6729
Epoch 46/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0812 - accuracy: 0.9782 - val_loss: 1.4860 - val_accuracy: 0.6755
Epoch 47/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0812 - accuracy: 0.9780 - val_loss: 1.5060 - val_accuracy: 0.6576
Epoch 48/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0770 - accuracy: 0.9803 - val_loss: 1.5116 - val_accuracy: 0.6636
Epoch 49/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0692 - accuracy: 0.9839 - val_loss: 1.5047 - val_accuracy: 0.6702
Epoch 50/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0671 - accuracy: 0.9830 - val_loss: 1.5341 - val_accuracy: 0.6649
Epoch 51/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0620 - accuracy: 0.9850 - val_loss: 1.5512 - val_accuracy: 0.6616
Epoch 52/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0725 - accuracy: 0.9781 - val_loss: 1.5538 - val_accuracy: 0.6602
Epoch 53/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0609 - accuracy: 0.9845 - val_loss: 1.5622 - val_accuracy: 0.6689
Epoch 54/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0663 - accuracy: 0.9815 - val_loss: 1.5714 - val_accuracy: 0.6642
Epoch 55/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0568 - accuracy: 0.9860 - val_loss: 1.5876 - val_accuracy: 0.6682
Epoch 56/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0583 - accuracy: 0.9858 - val_loss: 1.6072 - val_accuracy: 0.6609
Epoch 57/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0576 - accuracy: 0.9852 - val_loss: 1.6135 - val_accuracy: 0.6636
Epoch 58/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0552 - accuracy: 0.9839 - val_loss: 1.6176 - val_accuracy: 0.6596
Epoch 59/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0525 - accuracy: 0.9848 - val_loss: 1.6063 - val_accuracy: 0.6656
Epoch 60/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0422 - accuracy: 0.9894 - val_loss: 1.6089 - val_accuracy: 0.6709
Epoch 61/100
266/266 [==============================] - 4s 17ms/step - loss: 0.0572 - accuracy: 0.9834 - val_loss: 1.6265 - val_accuracy: 0.6622
Epoch 62/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0427 - accuracy: 0.9892 - val_loss: 1.6348 - val_accuracy: 0.6636
Epoch 63/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0460 - accuracy: 0.9867 - val_loss: 1.6386 - val_accuracy: 0.6556
Epoch 64/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0380 - accuracy: 0.9906 - val_loss: 1.6455 - val_accuracy: 0.6622
Epoch 65/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0396 - accuracy: 0.9915 - val_loss: 1.6528 - val_accuracy: 0.6722
Epoch 66/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0460 - accuracy: 0.9860 - val_loss: 1.6577 - val_accuracy: 0.6649
Epoch 67/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0437 - accuracy: 0.9902 - val_loss: 1.6668 - val_accuracy: 0.6622
Epoch 68/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0438 - accuracy: 0.9852 - val_loss: 1.7120 - val_accuracy: 0.6596
Epoch 69/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0430 - accuracy: 0.9887 - val_loss: 1.6773 - val_accuracy: 0.6622
Epoch 70/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0349 - accuracy: 0.9926 - val_loss: 1.6791 - val_accuracy: 0.6695
Epoch 71/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0314 - accuracy: 0.9930 - val_loss: 1.7069 - val_accuracy: 0.6562
Epoch 72/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0462 - accuracy: 0.9843 - val_loss: 1.6968 - val_accuracy: 0.6549
Epoch 73/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0410 - accuracy: 0.9890 - val_loss: 1.7189 - val_accuracy: 0.6523
Epoch 74/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0323 - accuracy: 0.9906 - val_loss: 1.7390 - val_accuracy: 0.6543
Epoch 75/100
266/266 [==============================] - 5s 18ms/step - loss: 0.0301 - accuracy: 0.9927 - val_loss: 1.7581 - val_accuracy: 0.6509
Epoch 76/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0281 - accuracy: 0.9924 - val_loss: 1.7541 - val_accuracy: 0.6569
Epoch 77/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0276 - accuracy: 0.9932 - val_loss: 1.7660 - val_accuracy: 0.6622
Epoch 78/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0331 - accuracy: 0.9904 - val_loss: 1.7823 - val_accuracy: 0.6662
Epoch 79/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0300 - accuracy: 0.9911 - val_loss: 1.7810 - val_accuracy: 0.6649
Epoch 80/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0337 - accuracy: 0.9896 - val_loss: 1.7871 - val_accuracy: 0.6582
Epoch 81/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0339 - accuracy: 0.9902 - val_loss: 1.8132 - val_accuracy: 0.6556
Epoch 82/100
266/266 [==============================] - 5s 18ms/step - loss: 0.0294 - accuracy: 0.9922 - val_loss: 1.7909 - val_accuracy: 0.6569
Epoch 83/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0306 - accuracy: 0.9904 - val_loss: 1.7777 - val_accuracy: 0.6556
Epoch 84/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0278 - accuracy: 0.9926 - val_loss: 1.8000 - val_accuracy: 0.6496
Epoch 85/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0275 - accuracy: 0.9930 - val_loss: 1.8331 - val_accuracy: 0.6529
Epoch 86/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0263 - accuracy: 0.9934 - val_loss: 1.8388 - val_accuracy: 0.6523
Epoch 87/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0276 - accuracy: 0.9921 - val_loss: 1.8125 - val_accuracy: 0.6609
Epoch 88/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0306 - accuracy: 0.9912 - val_loss: 1.8105 - val_accuracy: 0.6543
Epoch 89/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0351 - accuracy: 0.9897 - val_loss: 1.8171 - val_accuracy: 0.6556
Epoch 90/100
266/266 [==============================] - 5s 18ms/step - loss: 0.0256 - accuracy: 0.9932 - val_loss: 1.8572 - val_accuracy: 0.6543
Epoch 91/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0278 - accuracy: 0.9929 - val_loss: 1.8685 - val_accuracy: 0.6456
Epoch 92/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0229 - accuracy: 0.9928 - val_loss: 1.8961 - val_accuracy: 0.6443
Epoch 93/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0225 - accuracy: 0.9941 - val_loss: 1.9056 - val_accuracy: 0.6463
Epoch 94/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0230 - accuracy: 0.9934 - val_loss: 1.8990 - val_accuracy: 0.6602
Epoch 95/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0227 - accuracy: 0.9943 - val_loss: 1.8701 - val_accuracy: 0.6569
Epoch 96/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0218 - accuracy: 0.9940 - val_loss: 1.9155 - val_accuracy: 0.6469
Epoch 97/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0219 - accuracy: 0.9942 - val_loss: 1.8899 - val_accuracy: 0.6562
Epoch 98/100
266/266 [==============================] - 5s 18ms/step - loss: 0.0249 - accuracy: 0.9919 - val_loss: 1.8864 - val_accuracy: 0.6596
Epoch 99/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0240 - accuracy: 0.9936 - val_loss: 1.8650 - val_accuracy: 0.6602
Epoch 100/100
266/266 [==============================] - 5s 17ms/step - loss: 0.0268 - accuracy: 0.9922 - val_loss: 1.8895 - val_accuracy: 0.6536
In [ ]:
loss, accuracy = model_report(DENSENET_MODEL, DENSENET_MODEL_history)
losses["DENSENET_FEW"] = loss
accuracies["DENSENET_FEW"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.933
Accuracy: 64.435%
Training the entire network
In [ ]:
# transfer learning: DenseNet trained on ImageNet, without the top layer

def init_DENSENET_model(summary, optimizer = tf.optimizers.Adam, lr = 0.00005):
  # DenseNet121 convolutional base pretrained on ImageNet, without the classification head
  DENSENET_MODEL = tf.keras.applications.densenet.DenseNet121(input_shape=(32,32,3), include_top=False, weights='imagenet')

  # unfreeze all layers for full fine-tuning
  DENSENET_MODEL.trainable = True

  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR-100 classification
  prediction_layer = tf.keras.layers.Dense(CLASSES_NUM, activation='softmax')
  model = tf.keras.Sequential([DENSENET_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
DENSENET_MODEL = init_DENSENET_model(True)
DENSENET_MODEL_history = train_model(DENSENET_MODEL)
Model: "sequential_10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_7 (Dropout)          (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_7 ( (None, 1024)              0         
_________________________________________________________________
dense_14 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/100
266/266 [==============================] - 19s 36ms/step - loss: 3.5545 - accuracy: 0.1570 - val_loss: 1.8660 - val_accuracy: 0.5193
Epoch 2/100
266/266 [==============================] - 8s 30ms/step - loss: 1.7508 - accuracy: 0.4938 - val_loss: 1.1964 - val_accuracy: 0.6562
Epoch 3/100
266/266 [==============================] - 8s 30ms/step - loss: 1.2685 - accuracy: 0.6274 - val_loss: 1.0863 - val_accuracy: 0.6769
Epoch 4/100
266/266 [==============================] - 8s 30ms/step - loss: 0.9641 - accuracy: 0.7147 - val_loss: 0.9463 - val_accuracy: 0.7314
Epoch 5/100
266/266 [==============================] - 8s 30ms/step - loss: 0.7910 - accuracy: 0.7572 - val_loss: 0.8856 - val_accuracy: 0.7460
Epoch 6/100
266/266 [==============================] - 8s 30ms/step - loss: 0.6026 - accuracy: 0.8141 - val_loss: 0.8534 - val_accuracy: 0.7593
Epoch 7/100
266/266 [==============================] - 8s 30ms/step - loss: 0.4992 - accuracy: 0.8535 - val_loss: 0.8748 - val_accuracy: 0.7493
Epoch 8/100
266/266 [==============================] - 8s 30ms/step - loss: 0.4182 - accuracy: 0.8712 - val_loss: 0.8501 - val_accuracy: 0.7566
Epoch 9/100
266/266 [==============================] - 8s 30ms/step - loss: 0.2948 - accuracy: 0.9054 - val_loss: 0.8975 - val_accuracy: 0.7566
Epoch 10/100
266/266 [==============================] - 8s 30ms/step - loss: 0.2520 - accuracy: 0.9218 - val_loss: 0.8724 - val_accuracy: 0.7739
Epoch 11/100
266/266 [==============================] - 8s 30ms/step - loss: 0.2041 - accuracy: 0.9387 - val_loss: 0.8688 - val_accuracy: 0.7799
Epoch 12/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1847 - accuracy: 0.9438 - val_loss: 0.8773 - val_accuracy: 0.7726
Epoch 13/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1715 - accuracy: 0.9476 - val_loss: 0.9898 - val_accuracy: 0.7706
Epoch 14/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1554 - accuracy: 0.9519 - val_loss: 0.9301 - val_accuracy: 0.7753
Epoch 15/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1341 - accuracy: 0.9584 - val_loss: 0.9345 - val_accuracy: 0.7779
Epoch 16/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1323 - accuracy: 0.9583 - val_loss: 0.9478 - val_accuracy: 0.7759
Epoch 17/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1224 - accuracy: 0.9606 - val_loss: 0.9916 - val_accuracy: 0.7726
Epoch 18/100
266/266 [==============================] - 8s 31ms/step - loss: 0.1119 - accuracy: 0.9649 - val_loss: 1.0346 - val_accuracy: 0.7660
Epoch 19/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0936 - accuracy: 0.9710 - val_loss: 0.9991 - val_accuracy: 0.7739
Epoch 20/100
266/266 [==============================] - 8s 30ms/step - loss: 0.1008 - accuracy: 0.9680 - val_loss: 1.0367 - val_accuracy: 0.7680
Epoch 21/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0882 - accuracy: 0.9715 - val_loss: 1.0248 - val_accuracy: 0.7753
Epoch 22/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0870 - accuracy: 0.9716 - val_loss: 0.9576 - val_accuracy: 0.7906
Epoch 23/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0774 - accuracy: 0.9741 - val_loss: 1.0640 - val_accuracy: 0.7586
Epoch 24/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0912 - accuracy: 0.9698 - val_loss: 0.9778 - val_accuracy: 0.7872
Epoch 25/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0744 - accuracy: 0.9745 - val_loss: 0.9926 - val_accuracy: 0.7812
Epoch 26/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0775 - accuracy: 0.9766 - val_loss: 1.1076 - val_accuracy: 0.7573
Epoch 27/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0868 - accuracy: 0.9743 - val_loss: 0.9775 - val_accuracy: 0.7819
Epoch 28/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0662 - accuracy: 0.9788 - val_loss: 1.0132 - val_accuracy: 0.7713
Epoch 29/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0711 - accuracy: 0.9784 - val_loss: 1.0460 - val_accuracy: 0.7806
Epoch 30/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0661 - accuracy: 0.9804 - val_loss: 0.9864 - val_accuracy: 0.7926
Epoch 31/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0529 - accuracy: 0.9840 - val_loss: 1.0795 - val_accuracy: 0.7793
Epoch 32/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0685 - accuracy: 0.9807 - val_loss: 1.0224 - val_accuracy: 0.7906
Epoch 33/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0617 - accuracy: 0.9801 - val_loss: 1.1166 - val_accuracy: 0.7673
Epoch 34/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0853 - accuracy: 0.9740 - val_loss: 1.0552 - val_accuracy: 0.7826
Epoch 35/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0650 - accuracy: 0.9797 - val_loss: 0.9886 - val_accuracy: 0.7819
Epoch 36/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0519 - accuracy: 0.9838 - val_loss: 0.9819 - val_accuracy: 0.7826
Epoch 37/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0721 - accuracy: 0.9772 - val_loss: 1.0589 - val_accuracy: 0.7746
Epoch 38/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0560 - accuracy: 0.9824 - val_loss: 0.9850 - val_accuracy: 0.7812
Epoch 39/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0460 - accuracy: 0.9851 - val_loss: 1.1149 - val_accuracy: 0.7759
Epoch 40/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0366 - accuracy: 0.9858 - val_loss: 1.0154 - val_accuracy: 0.7959
Epoch 41/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0521 - accuracy: 0.9838 - val_loss: 1.0240 - val_accuracy: 0.7886
Epoch 42/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0510 - accuracy: 0.9844 - val_loss: 1.0997 - val_accuracy: 0.7733
Epoch 43/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0375 - accuracy: 0.9865 - val_loss: 1.0954 - val_accuracy: 0.7693
Epoch 44/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0531 - accuracy: 0.9825 - val_loss: 1.1730 - val_accuracy: 0.7779
Epoch 45/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0492 - accuracy: 0.9838 - val_loss: 1.1799 - val_accuracy: 0.7566
Epoch 46/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0622 - accuracy: 0.9810 - val_loss: 1.0972 - val_accuracy: 0.7846
Epoch 47/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0517 - accuracy: 0.9829 - val_loss: 1.1222 - val_accuracy: 0.7799
Epoch 48/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0398 - accuracy: 0.9874 - val_loss: 1.0953 - val_accuracy: 0.7839
Epoch 49/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0353 - accuracy: 0.9888 - val_loss: 1.0400 - val_accuracy: 0.7859
Epoch 50/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0307 - accuracy: 0.9897 - val_loss: 1.0217 - val_accuracy: 0.7872
Epoch 51/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0292 - accuracy: 0.9913 - val_loss: 1.0939 - val_accuracy: 0.7886
Epoch 52/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0448 - accuracy: 0.9852 - val_loss: 1.2408 - val_accuracy: 0.7706
Epoch 53/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0326 - accuracy: 0.9903 - val_loss: 0.9956 - val_accuracy: 0.7985
Epoch 54/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0264 - accuracy: 0.9912 - val_loss: 1.0396 - val_accuracy: 0.7859
Epoch 55/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0311 - accuracy: 0.9908 - val_loss: 1.1061 - val_accuracy: 0.7886
Epoch 56/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0380 - accuracy: 0.9885 - val_loss: 1.0057 - val_accuracy: 0.8072
Epoch 57/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0333 - accuracy: 0.9885 - val_loss: 1.0518 - val_accuracy: 0.7926
Epoch 58/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0475 - accuracy: 0.9848 - val_loss: 1.0538 - val_accuracy: 0.7939
Epoch 59/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0450 - accuracy: 0.9875 - val_loss: 0.9976 - val_accuracy: 0.7899
Epoch 60/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0353 - accuracy: 0.9894 - val_loss: 1.0094 - val_accuracy: 0.7846
Epoch 61/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0323 - accuracy: 0.9890 - val_loss: 1.0083 - val_accuracy: 0.7906
Epoch 62/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0340 - accuracy: 0.9906 - val_loss: 1.0316 - val_accuracy: 0.7939
Epoch 63/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0285 - accuracy: 0.9903 - val_loss: 1.0248 - val_accuracy: 0.7799
Epoch 64/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0281 - accuracy: 0.9904 - val_loss: 1.0591 - val_accuracy: 0.7879
Epoch 65/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0391 - accuracy: 0.9882 - val_loss: 1.2090 - val_accuracy: 0.7699
Epoch 66/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0315 - accuracy: 0.9903 - val_loss: 1.2188 - val_accuracy: 0.7480
Epoch 67/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0269 - accuracy: 0.9927 - val_loss: 1.0073 - val_accuracy: 0.7912
Epoch 68/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0174 - accuracy: 0.9947 - val_loss: 1.2337 - val_accuracy: 0.7706
Epoch 69/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0201 - accuracy: 0.9937 - val_loss: 1.1303 - val_accuracy: 0.7832
Epoch 70/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0246 - accuracy: 0.9926 - val_loss: 1.1031 - val_accuracy: 0.7799
Epoch 71/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0344 - accuracy: 0.9896 - val_loss: 1.0756 - val_accuracy: 0.7852
Epoch 72/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0369 - accuracy: 0.9872 - val_loss: 1.1097 - val_accuracy: 0.7779
Epoch 73/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0246 - accuracy: 0.9923 - val_loss: 1.2000 - val_accuracy: 0.7726
Epoch 74/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0393 - accuracy: 0.9870 - val_loss: 1.1939 - val_accuracy: 0.7646
Epoch 75/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0423 - accuracy: 0.9879 - val_loss: 1.0998 - val_accuracy: 0.7852
Epoch 76/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0248 - accuracy: 0.9927 - val_loss: 1.0704 - val_accuracy: 0.7832
Epoch 77/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0284 - accuracy: 0.9920 - val_loss: 1.0861 - val_accuracy: 0.7839
Epoch 78/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0213 - accuracy: 0.9944 - val_loss: 1.0165 - val_accuracy: 0.7992
Epoch 79/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0164 - accuracy: 0.9941 - val_loss: 1.0431 - val_accuracy: 0.7912
Epoch 80/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0234 - accuracy: 0.9921 - val_loss: 1.1520 - val_accuracy: 0.7846
Epoch 81/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0304 - accuracy: 0.9909 - val_loss: 1.0533 - val_accuracy: 0.8032
Epoch 82/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0267 - accuracy: 0.9914 - val_loss: 1.0951 - val_accuracy: 0.8012
Epoch 83/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0241 - accuracy: 0.9928 - val_loss: 1.1647 - val_accuracy: 0.7846
Epoch 84/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0310 - accuracy: 0.9916 - val_loss: 1.0999 - val_accuracy: 0.7826
Epoch 85/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0297 - accuracy: 0.9895 - val_loss: 1.0652 - val_accuracy: 0.8032
Epoch 86/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0265 - accuracy: 0.9911 - val_loss: 1.0860 - val_accuracy: 0.7926
Epoch 87/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0263 - accuracy: 0.9939 - val_loss: 1.0527 - val_accuracy: 0.8025
Epoch 88/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0228 - accuracy: 0.9923 - val_loss: 1.1183 - val_accuracy: 0.7992
Epoch 89/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0291 - accuracy: 0.9904 - val_loss: 1.0974 - val_accuracy: 0.7899
Epoch 90/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0149 - accuracy: 0.9957 - val_loss: 1.1700 - val_accuracy: 0.7806
Epoch 91/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0155 - accuracy: 0.9941 - val_loss: 1.0649 - val_accuracy: 0.7979
Epoch 92/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0163 - accuracy: 0.9949 - val_loss: 1.2227 - val_accuracy: 0.7680
Epoch 93/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0197 - accuracy: 0.9929 - val_loss: 1.1024 - val_accuracy: 0.7919
Epoch 94/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0353 - accuracy: 0.9908 - val_loss: 1.0977 - val_accuracy: 0.7906
Epoch 95/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0276 - accuracy: 0.9902 - val_loss: 1.0822 - val_accuracy: 0.7912
Epoch 96/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0199 - accuracy: 0.9946 - val_loss: 1.2021 - val_accuracy: 0.7972
Epoch 97/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0187 - accuracy: 0.9946 - val_loss: 1.1502 - val_accuracy: 0.7872
Epoch 98/100
266/266 [==============================] - 8s 30ms/step - loss: 0.0337 - accuracy: 0.9906 - val_loss: 1.1426 - val_accuracy: 0.7959
Epoch 99/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0239 - accuracy: 0.9929 - val_loss: 1.1455 - val_accuracy: 0.7866
Epoch 100/100
266/266 [==============================] - 8s 31ms/step - loss: 0.0202 - accuracy: 0.9943 - val_loss: 1.0401 - val_accuracy: 0.8065
In [ ]:
loss, accuracy = model_report(DENSENET_MODEL, DENSENET_MODEL_history)
losses["DENSENET_ALL"] = loss
accuracies["DENSENET_ALL"] = accuracy 
Test set evaluation metrics
---------------------------
Loss:     1.005
Accuracy: 80.456%

Comparison bar plots

In [ ]:
# set width of bar
barWidth = 0.15
model_names = ['VGG16', 'MobileNet', 'DenseNet']

# set height of bars
bar1 = [accuracies["VGG_NONE"],accuracies["MOBILENET_NONE"],accuracies["DENSENET_NONE"]]
bar2 = [accuracies["VGG_FEW"],accuracies["MOBILENET_FEW"],accuracies["DENSENET_FEW"]]
bar3 = [accuracies["VGG_ALL"],accuracies["MOBILENET_ALL"],accuracies["DENSENET_ALL"]]

# Set position of bar on X axis
r1 = np.arange(3)
r2 = [x + barWidth for x in r1]
r3 = [x + barWidth for x in r2]

plt.figure(figsize=(12,5))
plt.bar(r1, bar1, color='#003f5c', width=barWidth, edgecolor='white', label = 'Only top layer')
plt.bar(r2, bar2, color='#ffa600', width=barWidth, edgecolor='white', label = 'Only few layers')
plt.bar(r3, bar3, color='#bc5090', width=barWidth, edgecolor='white', label = 'All layers')
plt.xticks([r + barWidth for r in range(3)], model_names)
plt.ylim(bottom=0.1)
plt.legend(loc='best')
plt.title("Experiments on Trainable Layers")
plt.ylabel("Classification Accuracy")
plt.grid(axis="y", linestyle="--")
plt.show()

We observe that, for all three networks, the more layers we make trainable, the better the performance on the test data. Specifically, training only the classification head yields the lowest accuracy. Accuracy improves once we also train some of the convolutional layers closest to the network's output. The best classification performance, however, is obtained when all layers of each model are trained. For this reason, in the next section we focus exclusively on optimizing the transfer-learning networks with all of their layers trainable.

Network optimization

As seen in the training runs so far, overfitting is quite pronounced: the networks fit the training data almost perfectly but fail to generalize to the broader classification problem. In this section we therefore apply several optimization techniques aimed at improving model performance and combating overfitting. Specifically, we use the following techniques:

  • Early Stopping: A technique for combating overfitting in which training is stopped early if a monitored performance metric (usually the validation loss) stops improving. A patience parameter controls how tolerant we are of the loss failing to improve: if it does not improve for patience consecutive epochs, training is halted. We set patience to 20 and also set restore_best_weights to True, so that the weights that achieved the lowest validation loss during training are restored at the end:

    callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=20, restore_best_weights=True)
    
  • Dropout: A regularization technique that, during training, randomly drops (sets to zero) a fraction of the output units of the layer it is applied to. This prevents units from co-adapting too strongly and reduces overfitting on small training sets. Dropout takes a fractional rate as input, such as 0.1, 0.2, or 0.4, which corresponds to randomly dropping 10%, 20%, or 40% of the layer's output units.

  • L2 Regularization: This technique adds to the loss function a penalty term equal to the squared L2 norm of the weight vector, scaled by a parameter λ. If λ is zero, no regularization is applied and the loss consists only of the error between the target $y$ and the prediction $\hat{y}$. For very large values of λ the penalty dominates and drives the model into underfitting. Choosing λ appropriately is therefore important; after experimentation we settle on the value 0.001.

[figure: l2-regularization]
  • Batch Normalization: Used to improve the training speed, stability, and performance of neural networks. It essentially normalizes a layer's inputs by appropriately rescaling the activations. The usefulness of this layer is well established, as it has appeared in numerous applications in recent years, yet the reason for its effectiveness is not fully understood. The most likely explanation concerns the problem of internal covariate shift, which limits the usable learning rate because of how the parameters are initialized; using batch normalization appears to alleviate this problem.

  • Data augmentation: Overfitting generally occurs when there are few training examples. One way to address this is to enlarge the training set by applying random transformations (rotations, shifts, etc.) to the original images. The goal is that during training the model never sees exactly the same image twice; this exposes it to more variations of the data, so it generalizes better. Using the "ImageDataGenerator" of "tf.keras", we try different transformations of the training set, and the "new" data are used during training. However, this procedure does not appear to improve the accuracy of our models, which is why only the four techniques described above are used.
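The three in-layer techniques above can be illustrated with a minimal NumPy sketch (a toy model of what the Keras `Dropout`, `kernel_regularizer=l2(...)`, and `BatchNormalization` layers do, not their actual implementations; all function names here are our own):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, training=True):
    # during training, randomly zero a fraction `rate` of the units and
    # rescale the survivors so the expected activation stays unchanged
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

def l2_penalty(weights, lam=0.001):
    # penalty term added to the loss: lam * ||w||_2^2
    return lam * float(np.sum(weights ** 2))

def batch_norm(x, eps=1e-3):
    # normalize each feature over the batch axis to zero mean, unit variance
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)
```

At inference time dropout is a no-op, which is why the `training` flag matters; Keras handles this switch automatically inside `model.fit` vs `model.predict`.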

In [ ]:
# Data augmentation
from tensorflow.keras.preprocessing.image import ImageDataGenerator  # tf.keras version, consistent with the rest of the notebook

image_gen_train = ImageDataGenerator(
                            rotation_range=90,
                            width_shift_range=0.1, 
                            height_shift_range=0.1,
                            horizontal_flip=True
                    )

# feed the training set through the augmenting generator
train_ds = image_gen_train.flow(x=x_train,
                                y=y_train,
                                batch_size=BATCH_SIZE,
                                shuffle=True)
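The transformations that `ImageDataGenerator` applies can be sketched directly in NumPy; the snippet below (our own illustrative helpers, not part of Keras) shows a horizontal flip and a width shift on an NHWC batch, the same kinds of operations `horizontal_flip=True` and `width_shift_range` enable:

```python
import numpy as np

def horizontal_flip(batch):
    # mirror each image along the width axis; batch layout is NHWC
    return batch[:, :, ::-1, :]

def width_shift(img, dx):
    # shift a single HWC image dx pixels to the right, zero-padding the left
    out = np.zeros_like(img)
    if dx > 0:
        out[:, dx:, :] = img[:, :-dx, :]
    else:
        out = img.copy()
    return out
```

Applied on the fly to every batch, such transforms mean the network rarely sees an identical image twice during training.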

Networks "from scratch"

We now define the dictionaries losses_opt and accuracies_opt, whose keys are the names of the models under examination and whose values are the optimized losses and accuracies, respectively. We store these values so that we can compare them against the non-optimized ones and see how much our models have improved. We also define the callback that is passed as a parameter to train_model (and specifically to model.fit) in order to implement the EarlyStopping discussed earlier.

In [ ]:
losses_opt = {}
accuracies_opt = {}
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=20, restore_best_weights=True)
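The behaviour of this callback can be sketched in plain Python (a simplified model of the patience counter and best-epoch tracking, not the Keras internals; `early_stopping_fit` is our own illustrative name):

```python
def early_stopping_fit(val_losses, patience=20):
    # returns (epoch training stops at, epoch whose weights are restored),
    # mimicking EarlyStopping(monitor='val_loss', restore_best_weights=True):
    # stop once val_loss has not improved for `patience` consecutive epochs
    best, best_epoch, wait = float('inf'), 0, 0
    for epoch, vl in enumerate(val_losses):
        if vl < best:
            best, best_epoch, wait = vl, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch, best_epoch
    return len(val_losses) - 1, best_epoch
```

With `restore_best_weights=True`, the model returned after training carries the weights of the best epoch, not of the last one, so a late plateau does not hurt the reported test metrics.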

Simple CNN

In [ ]:
# a simple CNN https://www.tensorflow.org/tutorials/images/cnn

def init_simple_model_optimized(summary, optimizer = tf.optimizers.Adam, lr = 0.00005, classes_num = 20):

  model = models.Sequential()

  # conv block 1: 32 filters with L2-regularized kernels, then BN, ReLU, pooling, dropout
  model.add(layers.Conv2D(32, (3, 3), kernel_regularizer=l2(0.01), input_shape=(32, 32, 3)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  # conv block 2: 64 filters
  model.add(layers.Conv2D(64, (3, 3), kernel_regularizer=l2(0.01)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  # conv block 3: 64 filters, no pooling
  model.add(layers.Conv2D(64, (3, 3), kernel_regularizer=l2(0.01)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())

  # classifier head
  model.add(layers.Flatten())
  model.add(layers.Dropout(0.3))
  model.add(layers.Dense(64, activation='relu'))
  model.add(layers.Dense(classes_num, activation='softmax'))

  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary:
    model.summary()
  return model
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_4 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 1024)              0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_2 (Dense)              (None, 64)                65600     
_________________________________________________________________
dense_3 (Dense)              (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 2s 5ms/step - loss: 4.2183 - accuracy: 0.0835 - val_loss: 4.1852 - val_accuracy: 0.0525
Epoch 2/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7806 - accuracy: 0.1706 - val_loss: 3.7235 - val_accuracy: 0.1762
Epoch 3/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5521 - accuracy: 0.2167 - val_loss: 3.3614 - val_accuracy: 0.2666
Epoch 4/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3918 - accuracy: 0.2571 - val_loss: 3.2176 - val_accuracy: 0.2812
Epoch 5/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1948 - accuracy: 0.2944 - val_loss: 3.0450 - val_accuracy: 0.3318
Epoch 6/200
266/266 [==============================] - 1s 4ms/step - loss: 3.0347 - accuracy: 0.3333 - val_loss: 2.8702 - val_accuracy: 0.3684
Epoch 7/200
266/266 [==============================] - 1s 4ms/step - loss: 2.8779 - accuracy: 0.3549 - val_loss: 2.8145 - val_accuracy: 0.3617
Epoch 8/200
266/266 [==============================] - 1s 4ms/step - loss: 2.7349 - accuracy: 0.3870 - val_loss: 2.7104 - val_accuracy: 0.3936
Epoch 9/200
266/266 [==============================] - 1s 4ms/step - loss: 2.6213 - accuracy: 0.4122 - val_loss: 2.6863 - val_accuracy: 0.3876
Epoch 10/200
266/266 [==============================] - 1s 4ms/step - loss: 2.5294 - accuracy: 0.4319 - val_loss: 2.6369 - val_accuracy: 0.3949
Epoch 11/200
266/266 [==============================] - 1s 4ms/step - loss: 2.4336 - accuracy: 0.4409 - val_loss: 2.4224 - val_accuracy: 0.4435
Epoch 12/200
266/266 [==============================] - 1s 4ms/step - loss: 2.3366 - accuracy: 0.4627 - val_loss: 2.5184 - val_accuracy: 0.4269
Epoch 13/200
266/266 [==============================] - 1s 4ms/step - loss: 2.2931 - accuracy: 0.4794 - val_loss: 2.4087 - val_accuracy: 0.4515
Epoch 14/200
266/266 [==============================] - 1s 4ms/step - loss: 2.2257 - accuracy: 0.4847 - val_loss: 2.2584 - val_accuracy: 0.4934
Epoch 15/200
266/266 [==============================] - 1s 4ms/step - loss: 2.1476 - accuracy: 0.4926 - val_loss: 2.2486 - val_accuracy: 0.4781
Epoch 16/200
266/266 [==============================] - 1s 4ms/step - loss: 2.0583 - accuracy: 0.5205 - val_loss: 2.1153 - val_accuracy: 0.5173
Epoch 17/200
266/266 [==============================] - 1s 4ms/step - loss: 2.0525 - accuracy: 0.5192 - val_loss: 2.2500 - val_accuracy: 0.4641
Epoch 18/200
266/266 [==============================] - 1s 4ms/step - loss: 1.9686 - accuracy: 0.5265 - val_loss: 2.4673 - val_accuracy: 0.4182
Epoch 19/200
266/266 [==============================] - 1s 4ms/step - loss: 1.9476 - accuracy: 0.5417 - val_loss: 2.0522 - val_accuracy: 0.5226
Epoch 20/200
266/266 [==============================] - 1s 4ms/step - loss: 1.9089 - accuracy: 0.5458 - val_loss: 2.0014 - val_accuracy: 0.5253
Epoch 21/200
266/266 [==============================] - 1s 4ms/step - loss: 1.8663 - accuracy: 0.5513 - val_loss: 2.0436 - val_accuracy: 0.5047
Epoch 22/200
266/266 [==============================] - 1s 4ms/step - loss: 1.8095 - accuracy: 0.5621 - val_loss: 2.3075 - val_accuracy: 0.4422
Epoch 23/200
266/266 [==============================] - 1s 4ms/step - loss: 1.7409 - accuracy: 0.5828 - val_loss: 2.5362 - val_accuracy: 0.4096
Epoch 24/200
266/266 [==============================] - 1s 4ms/step - loss: 1.7643 - accuracy: 0.5699 - val_loss: 1.8757 - val_accuracy: 0.5399
Epoch 25/200
266/266 [==============================] - 1s 4ms/step - loss: 1.6975 - accuracy: 0.5801 - val_loss: 1.9877 - val_accuracy: 0.5206
Epoch 26/200
266/266 [==============================] - 1s 4ms/step - loss: 1.6692 - accuracy: 0.5950 - val_loss: 2.1042 - val_accuracy: 0.4794
Epoch 27/200
266/266 [==============================] - 1s 4ms/step - loss: 1.6303 - accuracy: 0.6033 - val_loss: 1.9823 - val_accuracy: 0.5186
Epoch 28/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5824 - accuracy: 0.6124 - val_loss: 1.8892 - val_accuracy: 0.5479
Epoch 29/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5798 - accuracy: 0.5983 - val_loss: 1.8005 - val_accuracy: 0.5645
Epoch 30/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5214 - accuracy: 0.6197 - val_loss: 1.8047 - val_accuracy: 0.5685
Epoch 31/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5256 - accuracy: 0.6140 - val_loss: 1.7411 - val_accuracy: 0.5691
Epoch 32/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4746 - accuracy: 0.6316 - val_loss: 1.6714 - val_accuracy: 0.5924
Epoch 33/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4569 - accuracy: 0.6378 - val_loss: 1.6725 - val_accuracy: 0.5884
Epoch 34/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4173 - accuracy: 0.6429 - val_loss: 1.7502 - val_accuracy: 0.5665
Epoch 35/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4010 - accuracy: 0.6439 - val_loss: 1.8007 - val_accuracy: 0.5519
Epoch 36/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4238 - accuracy: 0.6373 - val_loss: 1.6941 - val_accuracy: 0.5811
Epoch 37/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3806 - accuracy: 0.6462 - val_loss: 1.6793 - val_accuracy: 0.5898
Epoch 38/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3425 - accuracy: 0.6642 - val_loss: 1.5611 - val_accuracy: 0.6197
Epoch 39/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3132 - accuracy: 0.6701 - val_loss: 1.7080 - val_accuracy: 0.5598
Epoch 40/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2745 - accuracy: 0.6774 - val_loss: 1.5964 - val_accuracy: 0.6124
Epoch 41/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2890 - accuracy: 0.6773 - val_loss: 1.5872 - val_accuracy: 0.6097
Epoch 42/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2777 - accuracy: 0.6770 - val_loss: 1.6297 - val_accuracy: 0.5805
Epoch 43/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2402 - accuracy: 0.6856 - val_loss: 1.6205 - val_accuracy: 0.5957
Epoch 44/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2309 - accuracy: 0.6862 - val_loss: 1.6043 - val_accuracy: 0.5977
Epoch 45/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2101 - accuracy: 0.6918 - val_loss: 1.7713 - val_accuracy: 0.5539
Epoch 46/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2079 - accuracy: 0.6880 - val_loss: 1.4786 - val_accuracy: 0.6509
Epoch 47/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1938 - accuracy: 0.7028 - val_loss: 1.5206 - val_accuracy: 0.6237
Epoch 48/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1622 - accuracy: 0.7011 - val_loss: 1.5465 - val_accuracy: 0.6117
Epoch 49/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1663 - accuracy: 0.6977 - val_loss: 1.5157 - val_accuracy: 0.6130
Epoch 50/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1587 - accuracy: 0.7052 - val_loss: 1.6190 - val_accuracy: 0.5918
Epoch 51/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1069 - accuracy: 0.7173 - val_loss: 1.6166 - val_accuracy: 0.5991
Epoch 52/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1146 - accuracy: 0.7154 - val_loss: 1.4890 - val_accuracy: 0.6257
Epoch 53/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1071 - accuracy: 0.7109 - val_loss: 1.4453 - val_accuracy: 0.6383
Epoch 54/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0780 - accuracy: 0.7217 - val_loss: 1.5325 - val_accuracy: 0.6124
Epoch 55/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0842 - accuracy: 0.7221 - val_loss: 1.4442 - val_accuracy: 0.6469
Epoch 56/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0637 - accuracy: 0.7172 - val_loss: 1.4306 - val_accuracy: 0.6376
Epoch 57/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0471 - accuracy: 0.7232 - val_loss: 1.4638 - val_accuracy: 0.6263
Epoch 58/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0427 - accuracy: 0.7302 - val_loss: 1.4463 - val_accuracy: 0.6503
Epoch 59/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0343 - accuracy: 0.7280 - val_loss: 1.4432 - val_accuracy: 0.6430
Epoch 60/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0172 - accuracy: 0.7310 - val_loss: 1.4543 - val_accuracy: 0.6297
Epoch 61/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9758 - accuracy: 0.7501 - val_loss: 1.3583 - val_accuracy: 0.6576
Epoch 62/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9779 - accuracy: 0.7437 - val_loss: 1.4086 - val_accuracy: 0.6436
Epoch 63/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9789 - accuracy: 0.7402 - val_loss: 1.4480 - val_accuracy: 0.6323
Epoch 64/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9563 - accuracy: 0.7526 - val_loss: 1.6335 - val_accuracy: 0.5858
Epoch 65/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9572 - accuracy: 0.7575 - val_loss: 1.3850 - val_accuracy: 0.6383
Epoch 66/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9334 - accuracy: 0.7649 - val_loss: 1.4343 - val_accuracy: 0.6396
Epoch 67/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9389 - accuracy: 0.7477 - val_loss: 1.3513 - val_accuracy: 0.6689
Epoch 68/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9386 - accuracy: 0.7570 - val_loss: 1.5275 - val_accuracy: 0.6157
Epoch 69/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9032 - accuracy: 0.7644 - val_loss: 1.3634 - val_accuracy: 0.6562
Epoch 70/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9136 - accuracy: 0.7559 - val_loss: 1.4356 - val_accuracy: 0.6370
Epoch 71/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9113 - accuracy: 0.7657 - val_loss: 1.3721 - val_accuracy: 0.6622
Epoch 72/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8995 - accuracy: 0.7556 - val_loss: 1.3725 - val_accuracy: 0.6489
Epoch 73/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8887 - accuracy: 0.7667 - val_loss: 1.4285 - val_accuracy: 0.6376
Epoch 74/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8541 - accuracy: 0.7772 - val_loss: 1.3344 - val_accuracy: 0.6622
Epoch 75/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8611 - accuracy: 0.7800 - val_loss: 1.3413 - val_accuracy: 0.6609
Epoch 76/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8776 - accuracy: 0.7733 - val_loss: 1.3632 - val_accuracy: 0.6536
Epoch 77/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8610 - accuracy: 0.7654 - val_loss: 1.3700 - val_accuracy: 0.6582
Epoch 78/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8726 - accuracy: 0.7686 - val_loss: 1.3272 - val_accuracy: 0.6536
Epoch 79/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8789 - accuracy: 0.7721 - val_loss: 1.3810 - val_accuracy: 0.6463
Epoch 80/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8280 - accuracy: 0.7854 - val_loss: 1.4521 - val_accuracy: 0.6310
Epoch 81/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8329 - accuracy: 0.7799 - val_loss: 1.3913 - val_accuracy: 0.6562
Epoch 82/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8325 - accuracy: 0.7828 - val_loss: 1.3246 - val_accuracy: 0.6629
Epoch 83/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8271 - accuracy: 0.7769 - val_loss: 1.5125 - val_accuracy: 0.6164
Epoch 84/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8007 - accuracy: 0.7886 - val_loss: 1.4691 - val_accuracy: 0.6257
Epoch 85/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8121 - accuracy: 0.7873 - val_loss: 1.3707 - val_accuracy: 0.6576
Epoch 86/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7823 - accuracy: 0.7985 - val_loss: 1.3960 - val_accuracy: 0.6549
Epoch 87/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8042 - accuracy: 0.7914 - val_loss: 1.3120 - val_accuracy: 0.6669
Epoch 88/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8034 - accuracy: 0.7832 - val_loss: 1.3856 - val_accuracy: 0.6529
Epoch 89/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7751 - accuracy: 0.7926 - val_loss: 1.3491 - val_accuracy: 0.6609
Epoch 90/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7778 - accuracy: 0.7953 - val_loss: 1.4757 - val_accuracy: 0.6283
Epoch 91/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7732 - accuracy: 0.7955 - val_loss: 1.3772 - val_accuracy: 0.6496
Epoch 92/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7549 - accuracy: 0.7991 - val_loss: 1.3280 - val_accuracy: 0.6749
Epoch 93/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7660 - accuracy: 0.7876 - val_loss: 1.2930 - val_accuracy: 0.6729
Epoch 94/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7321 - accuracy: 0.8028 - val_loss: 1.3517 - val_accuracy: 0.6589
Epoch 95/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7343 - accuracy: 0.8024 - val_loss: 1.3070 - val_accuracy: 0.6662
Epoch 96/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7251 - accuracy: 0.8065 - val_loss: 1.3489 - val_accuracy: 0.6609
Epoch 97/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7298 - accuracy: 0.8091 - val_loss: 1.2930 - val_accuracy: 0.6616
Epoch 98/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7207 - accuracy: 0.8079 - val_loss: 1.4173 - val_accuracy: 0.6456
Epoch 99/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7296 - accuracy: 0.8061 - val_loss: 1.3478 - val_accuracy: 0.6576
Epoch 100/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7041 - accuracy: 0.8176 - val_loss: 1.3815 - val_accuracy: 0.6543
Epoch 101/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6929 - accuracy: 0.8166 - val_loss: 1.3328 - val_accuracy: 0.6622
Epoch 102/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7024 - accuracy: 0.8153 - val_loss: 1.5208 - val_accuracy: 0.6277
Epoch 103/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6801 - accuracy: 0.8230 - val_loss: 1.3682 - val_accuracy: 0.6489
Epoch 104/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7073 - accuracy: 0.8134 - val_loss: 1.3315 - val_accuracy: 0.6669
Epoch 105/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6817 - accuracy: 0.8214 - val_loss: 1.3746 - val_accuracy: 0.6476
Epoch 106/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6750 - accuracy: 0.8200 - val_loss: 1.3759 - val_accuracy: 0.6602
Epoch 107/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6941 - accuracy: 0.8186 - val_loss: 1.3733 - val_accuracy: 0.6516
Epoch 108/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6685 - accuracy: 0.8282 - val_loss: 1.2898 - val_accuracy: 0.6609
Epoch 109/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6504 - accuracy: 0.8367 - val_loss: 1.3456 - val_accuracy: 0.6543
Epoch 110/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6504 - accuracy: 0.8310 - val_loss: 1.4253 - val_accuracy: 0.6390
Epoch 111/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6864 - accuracy: 0.8160 - val_loss: 1.3259 - val_accuracy: 0.6789
Epoch 112/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6816 - accuracy: 0.8161 - val_loss: 1.2850 - val_accuracy: 0.6722
Epoch 113/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6621 - accuracy: 0.8231 - val_loss: 1.5894 - val_accuracy: 0.6137
Epoch 114/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6583 - accuracy: 0.8258 - val_loss: 1.3503 - val_accuracy: 0.6602
Epoch 115/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6462 - accuracy: 0.8309 - val_loss: 1.3155 - val_accuracy: 0.6735
Epoch 116/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6479 - accuracy: 0.8282 - val_loss: 1.7024 - val_accuracy: 0.6044
Epoch 117/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6572 - accuracy: 0.8299 - val_loss: 1.3006 - val_accuracy: 0.6656
Epoch 118/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6463 - accuracy: 0.8263 - val_loss: 1.3451 - val_accuracy: 0.6636
Epoch 119/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6717 - accuracy: 0.8185 - val_loss: 1.4459 - val_accuracy: 0.6529
Epoch 120/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6271 - accuracy: 0.8321 - val_loss: 1.4151 - val_accuracy: 0.6529
Epoch 121/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6441 - accuracy: 0.8271 - val_loss: 1.4010 - val_accuracy: 0.6543
Epoch 122/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6234 - accuracy: 0.8367 - val_loss: 1.3521 - val_accuracy: 0.6602
Epoch 123/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6288 - accuracy: 0.8304 - val_loss: 1.5633 - val_accuracy: 0.6243
Epoch 124/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6285 - accuracy: 0.8314 - val_loss: 1.4140 - val_accuracy: 0.6496
Epoch 125/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5924 - accuracy: 0.8491 - val_loss: 1.3630 - val_accuracy: 0.6662
Epoch 126/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6096 - accuracy: 0.8452 - val_loss: 1.2631 - val_accuracy: 0.6828
Epoch 127/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6212 - accuracy: 0.8365 - val_loss: 1.3414 - val_accuracy: 0.6782
Epoch 128/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6118 - accuracy: 0.8364 - val_loss: 1.3744 - val_accuracy: 0.6622
Epoch 129/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6101 - accuracy: 0.8474 - val_loss: 1.3188 - val_accuracy: 0.6722
Epoch 130/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6099 - accuracy: 0.8434 - val_loss: 1.2803 - val_accuracy: 0.6789
Epoch 131/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5969 - accuracy: 0.8402 - val_loss: 1.3577 - val_accuracy: 0.6682
Epoch 132/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6034 - accuracy: 0.8425 - val_loss: 1.4090 - val_accuracy: 0.6483
Epoch 133/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5953 - accuracy: 0.8451 - val_loss: 1.3223 - val_accuracy: 0.6775
Epoch 134/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5940 - accuracy: 0.8468 - val_loss: 1.3044 - val_accuracy: 0.6735
Epoch 135/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6010 - accuracy: 0.8410 - val_loss: 1.3067 - val_accuracy: 0.6735
Epoch 136/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6016 - accuracy: 0.8371 - val_loss: 1.4054 - val_accuracy: 0.6609
Epoch 137/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6068 - accuracy: 0.8385 - val_loss: 1.3192 - val_accuracy: 0.6762
Epoch 138/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5772 - accuracy: 0.8541 - val_loss: 1.3476 - val_accuracy: 0.6795
Epoch 139/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5968 - accuracy: 0.8457 - val_loss: 1.2910 - val_accuracy: 0.6802
Epoch 140/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5734 - accuracy: 0.8507 - val_loss: 1.2984 - val_accuracy: 0.6809
Epoch 141/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5800 - accuracy: 0.8487 - val_loss: 1.5109 - val_accuracy: 0.6403
Epoch 142/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5780 - accuracy: 0.8526 - val_loss: 1.2712 - val_accuracy: 0.6928
Epoch 143/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5785 - accuracy: 0.8539 - val_loss: 1.4346 - val_accuracy: 0.6503
Epoch 144/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5598 - accuracy: 0.8538 - val_loss: 1.3658 - val_accuracy: 0.6729
Epoch 145/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5503 - accuracy: 0.8624 - val_loss: 1.3044 - val_accuracy: 0.6828
Epoch 146/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5593 - accuracy: 0.8530 - val_loss: 1.3754 - val_accuracy: 0.6689
In [ ]:
loss, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
losses_opt["SIMPLE_MODEL"] = loss
accuracies_opt["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.265
Accuracy: 68.254%

CNN1

In [ ]:
def init_cnn1_model_optimized(summary, optimizer = tf.optimizers.Adam, lr = 0.00005, classes_num = 20):
  model = models.Sequential()

  # conv block 1: 32 filters with L2-regularized kernels, then BN, ReLU, pooling, dropout
  model.add(layers.Conv2D(32, (3, 3), kernel_regularizer=l2(0.01), input_shape=(32, 32, 3)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  # conv block 2: 64 filters
  model.add(layers.Conv2D(64, (3, 3), kernel_regularizer=l2(0.01)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  # conv block 3: 128 filters, average instead of max pooling
  model.add(layers.Conv2D(128, (3, 3), kernel_regularizer=l2(0.01)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.AveragePooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  # classifier head: wide dense layer with heavier dropout
  model.add(layers.Flatten())
  model.add(layers.Dense(1024, activation='relu'))
  model.add(layers.Dropout(0.4))
  model.add(layers.Dense(classes_num, activation='softmax'))

  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary:
    model.summary()
  return model
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_6 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_7 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_8 (Dropout)          (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_4 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dropout_9 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_5 (Dense)              (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 2s 5ms/step - loss: 4.1714 - accuracy: 0.1252 - val_loss: 4.4712 - val_accuracy: 0.0851
Epoch 2/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5904 - accuracy: 0.2701 - val_loss: 3.6754 - val_accuracy: 0.2055
Epoch 3/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2259 - accuracy: 0.3503 - val_loss: 3.0407 - val_accuracy: 0.3783
Epoch 4/200
266/266 [==============================] - 1s 4ms/step - loss: 2.9876 - accuracy: 0.3841 - val_loss: 2.8617 - val_accuracy: 0.4149
Epoch 5/200
266/266 [==============================] - 1s 4ms/step - loss: 2.8263 - accuracy: 0.4034 - val_loss: 2.7036 - val_accuracy: 0.4249
Epoch 6/200
266/266 [==============================] - 1s 4ms/step - loss: 2.6491 - accuracy: 0.4376 - val_loss: 2.7804 - val_accuracy: 0.3956
Epoch 7/200
266/266 [==============================] - 1s 4ms/step - loss: 2.5272 - accuracy: 0.4583 - val_loss: 2.7079 - val_accuracy: 0.4202
Epoch 8/200
266/266 [==============================] - 1s 4ms/step - loss: 2.3907 - accuracy: 0.4743 - val_loss: 2.4792 - val_accuracy: 0.4581
Epoch 9/200
266/266 [==============================] - 1s 4ms/step - loss: 2.2918 - accuracy: 0.5004 - val_loss: 2.5864 - val_accuracy: 0.4328
Epoch 10/200
266/266 [==============================] - 1s 4ms/step - loss: 2.1987 - accuracy: 0.5187 - val_loss: 2.4384 - val_accuracy: 0.4541
Epoch 11/200
266/266 [==============================] - 1s 4ms/step - loss: 2.1158 - accuracy: 0.5359 - val_loss: 2.3965 - val_accuracy: 0.4661
Epoch 12/200
266/266 [==============================] - 1s 4ms/step - loss: 2.0301 - accuracy: 0.5449 - val_loss: 2.2770 - val_accuracy: 0.4874
Epoch 13/200
266/266 [==============================] - 1s 4ms/step - loss: 1.9807 - accuracy: 0.5396 - val_loss: 2.3839 - val_accuracy: 0.4634
Epoch 14/200
266/266 [==============================] - 1s 4ms/step - loss: 1.9228 - accuracy: 0.5589 - val_loss: 2.0137 - val_accuracy: 0.5366
Epoch 15/200
266/266 [==============================] - 1s 4ms/step - loss: 1.8672 - accuracy: 0.5682 - val_loss: 2.0380 - val_accuracy: 0.5332
Epoch 16/200
266/266 [==============================] - 1s 4ms/step - loss: 1.8339 - accuracy: 0.5679 - val_loss: 1.9563 - val_accuracy: 0.5472
Epoch 17/200
266/266 [==============================] - 1s 4ms/step - loss: 1.7459 - accuracy: 0.5869 - val_loss: 2.0672 - val_accuracy: 0.5160
Epoch 18/200
266/266 [==============================] - 1s 4ms/step - loss: 1.6989 - accuracy: 0.5947 - val_loss: 2.0132 - val_accuracy: 0.5273
Epoch 19/200
266/266 [==============================] - 1s 4ms/step - loss: 1.6826 - accuracy: 0.5952 - val_loss: 2.0392 - val_accuracy: 0.5066
Epoch 20/200
266/266 [==============================] - 1s 4ms/step - loss: 1.6442 - accuracy: 0.6080 - val_loss: 2.0101 - val_accuracy: 0.5279
Epoch 21/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5967 - accuracy: 0.6104 - val_loss: 1.8491 - val_accuracy: 0.5505
Epoch 22/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5365 - accuracy: 0.6299 - val_loss: 1.8120 - val_accuracy: 0.5519
Epoch 23/200
266/266 [==============================] - 1s 4ms/step - loss: 1.5179 - accuracy: 0.6212 - val_loss: 1.7662 - val_accuracy: 0.5678
Epoch 24/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4682 - accuracy: 0.6425 - val_loss: 1.7309 - val_accuracy: 0.5711
Epoch 25/200
266/266 [==============================] - 1s 4ms/step - loss: 1.4677 - accuracy: 0.6412 - val_loss: 1.5989 - val_accuracy: 0.6157
Epoch 26/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3882 - accuracy: 0.6641 - val_loss: 1.7922 - val_accuracy: 0.5745
Epoch 27/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3859 - accuracy: 0.6537 - val_loss: 1.6020 - val_accuracy: 0.6110
Epoch 28/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3544 - accuracy: 0.6655 - val_loss: 1.6210 - val_accuracy: 0.6104
Epoch 29/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3264 - accuracy: 0.6718 - val_loss: 1.7397 - val_accuracy: 0.5771
Epoch 30/200
266/266 [==============================] - 1s 4ms/step - loss: 1.3044 - accuracy: 0.6670 - val_loss: 1.5262 - val_accuracy: 0.6217
Epoch 31/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2980 - accuracy: 0.6768 - val_loss: 1.7329 - val_accuracy: 0.5758
Epoch 32/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2613 - accuracy: 0.6852 - val_loss: 1.6611 - val_accuracy: 0.5864
Epoch 33/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2713 - accuracy: 0.6690 - val_loss: 1.4753 - val_accuracy: 0.6343
Epoch 34/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2370 - accuracy: 0.6911 - val_loss: 1.5281 - val_accuracy: 0.6270
Epoch 35/200
266/266 [==============================] - 1s 4ms/step - loss: 1.2167 - accuracy: 0.6865 - val_loss: 1.4155 - val_accuracy: 0.6616
Epoch 36/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1912 - accuracy: 0.6999 - val_loss: 1.5652 - val_accuracy: 0.6037
Epoch 37/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1566 - accuracy: 0.7027 - val_loss: 1.4864 - val_accuracy: 0.6469
Epoch 38/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1381 - accuracy: 0.7143 - val_loss: 1.6613 - val_accuracy: 0.5984
Epoch 39/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1234 - accuracy: 0.7048 - val_loss: 1.6118 - val_accuracy: 0.6124
Epoch 40/200
266/266 [==============================] - 1s 4ms/step - loss: 1.1265 - accuracy: 0.7036 - val_loss: 1.3882 - val_accuracy: 0.6423
Epoch 41/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0780 - accuracy: 0.7208 - val_loss: 1.4362 - val_accuracy: 0.6410
Epoch 42/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0780 - accuracy: 0.7190 - val_loss: 1.5151 - val_accuracy: 0.6277
Epoch 43/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0506 - accuracy: 0.7299 - val_loss: 1.3339 - val_accuracy: 0.6789
Epoch 44/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0682 - accuracy: 0.7205 - val_loss: 1.5994 - val_accuracy: 0.6170
Epoch 45/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0249 - accuracy: 0.7394 - val_loss: 1.4830 - val_accuracy: 0.6376
Epoch 46/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0190 - accuracy: 0.7325 - val_loss: 1.4252 - val_accuracy: 0.6576
Epoch 47/200
266/266 [==============================] - 1s 4ms/step - loss: 1.0014 - accuracy: 0.7397 - val_loss: 1.3840 - val_accuracy: 0.6523
Epoch 48/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9976 - accuracy: 0.7372 - val_loss: 1.3434 - val_accuracy: 0.6669
Epoch 49/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9665 - accuracy: 0.7531 - val_loss: 1.2904 - val_accuracy: 0.6802
Epoch 50/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9488 - accuracy: 0.7543 - val_loss: 1.3852 - val_accuracy: 0.6516
Epoch 51/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9586 - accuracy: 0.7484 - val_loss: 1.4238 - val_accuracy: 0.6403
Epoch 52/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9678 - accuracy: 0.7431 - val_loss: 1.3039 - val_accuracy: 0.6802
Epoch 53/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9267 - accuracy: 0.7596 - val_loss: 1.3189 - val_accuracy: 0.6782
Epoch 54/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9149 - accuracy: 0.7591 - val_loss: 1.4139 - val_accuracy: 0.6423
Epoch 55/200
266/266 [==============================] - 1s 4ms/step - loss: 0.9002 - accuracy: 0.7671 - val_loss: 1.2550 - val_accuracy: 0.6908
Epoch 56/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8802 - accuracy: 0.7648 - val_loss: 1.3091 - val_accuracy: 0.6782
Epoch 57/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8934 - accuracy: 0.7644 - val_loss: 1.3509 - val_accuracy: 0.6656
Epoch 58/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8623 - accuracy: 0.7694 - val_loss: 1.2899 - val_accuracy: 0.6815
Epoch 59/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8217 - accuracy: 0.7884 - val_loss: 1.3550 - val_accuracy: 0.6689
Epoch 60/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8542 - accuracy: 0.7760 - val_loss: 1.3708 - val_accuracy: 0.6662
Epoch 61/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8231 - accuracy: 0.7929 - val_loss: 1.5462 - val_accuracy: 0.6197
Epoch 62/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8660 - accuracy: 0.7764 - val_loss: 1.2831 - val_accuracy: 0.6795
Epoch 63/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8274 - accuracy: 0.7906 - val_loss: 1.2242 - val_accuracy: 0.6935
Epoch 64/200
266/266 [==============================] - 1s 4ms/step - loss: 0.8265 - accuracy: 0.7846 - val_loss: 1.4101 - val_accuracy: 0.6529
Epoch 65/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7931 - accuracy: 0.7930 - val_loss: 1.2083 - val_accuracy: 0.7021
Epoch 66/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7951 - accuracy: 0.7954 - val_loss: 1.2700 - val_accuracy: 0.6875
Epoch 67/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7913 - accuracy: 0.7910 - val_loss: 1.2690 - val_accuracy: 0.6762
Epoch 68/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7662 - accuracy: 0.8006 - val_loss: 1.3232 - val_accuracy: 0.6782
Epoch 69/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7672 - accuracy: 0.8028 - val_loss: 1.2542 - val_accuracy: 0.6941
Epoch 70/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7809 - accuracy: 0.7920 - val_loss: 1.2563 - val_accuracy: 0.6928
Epoch 71/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7491 - accuracy: 0.8040 - val_loss: 1.2226 - val_accuracy: 0.6875
Epoch 72/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7404 - accuracy: 0.8122 - val_loss: 1.3103 - val_accuracy: 0.6828
Epoch 73/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7553 - accuracy: 0.8079 - val_loss: 1.3681 - val_accuracy: 0.6729
Epoch 74/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7284 - accuracy: 0.8066 - val_loss: 1.3208 - val_accuracy: 0.6815
Epoch 75/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7358 - accuracy: 0.8095 - val_loss: 1.2782 - val_accuracy: 0.6789
Epoch 76/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7181 - accuracy: 0.8178 - val_loss: 1.1847 - val_accuracy: 0.7001
Epoch 77/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6990 - accuracy: 0.8207 - val_loss: 1.2852 - val_accuracy: 0.6815
Epoch 78/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7021 - accuracy: 0.8191 - val_loss: 1.3967 - val_accuracy: 0.6636
Epoch 79/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6874 - accuracy: 0.8244 - val_loss: 1.2352 - val_accuracy: 0.6948
Epoch 80/200
266/266 [==============================] - 1s 4ms/step - loss: 0.7067 - accuracy: 0.8180 - val_loss: 1.3149 - val_accuracy: 0.6735
Epoch 81/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6811 - accuracy: 0.8239 - val_loss: 1.2162 - val_accuracy: 0.7028
Epoch 82/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6761 - accuracy: 0.8216 - val_loss: 1.3773 - val_accuracy: 0.6662
Epoch 83/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6662 - accuracy: 0.8254 - val_loss: 1.2215 - val_accuracy: 0.7008
Epoch 84/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6405 - accuracy: 0.8315 - val_loss: 1.3325 - val_accuracy: 0.6735
Epoch 85/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6737 - accuracy: 0.8284 - val_loss: 1.3819 - val_accuracy: 0.6735
Epoch 86/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6641 - accuracy: 0.8256 - val_loss: 1.2716 - val_accuracy: 0.6795
Epoch 87/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6289 - accuracy: 0.8411 - val_loss: 1.3691 - val_accuracy: 0.6702
Epoch 88/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6405 - accuracy: 0.8340 - val_loss: 1.2065 - val_accuracy: 0.6988
Epoch 89/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6362 - accuracy: 0.8350 - val_loss: 1.2567 - val_accuracy: 0.6922
Epoch 90/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6296 - accuracy: 0.8400 - val_loss: 1.1999 - val_accuracy: 0.7134
Epoch 91/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6191 - accuracy: 0.8449 - val_loss: 1.1911 - val_accuracy: 0.7114
Epoch 92/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6156 - accuracy: 0.8378 - val_loss: 1.3433 - val_accuracy: 0.6815
Epoch 93/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6111 - accuracy: 0.8477 - val_loss: 1.2148 - val_accuracy: 0.7001
Epoch 94/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6177 - accuracy: 0.8413 - val_loss: 1.1816 - val_accuracy: 0.7088
Epoch 95/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6072 - accuracy: 0.8455 - val_loss: 1.3286 - val_accuracy: 0.6742
Epoch 96/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6246 - accuracy: 0.8446 - val_loss: 1.1690 - val_accuracy: 0.7081
Epoch 97/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5994 - accuracy: 0.8462 - val_loss: 1.3282 - val_accuracy: 0.6928
Epoch 98/200
266/266 [==============================] - 1s 4ms/step - loss: 0.6212 - accuracy: 0.8463 - val_loss: 1.2646 - val_accuracy: 0.6961
Epoch 99/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5948 - accuracy: 0.8494 - val_loss: 1.2634 - val_accuracy: 0.6961
Epoch 100/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5853 - accuracy: 0.8555 - val_loss: 1.1853 - val_accuracy: 0.7068
Epoch 101/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5773 - accuracy: 0.8570 - val_loss: 1.2687 - val_accuracy: 0.6961
Epoch 102/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5895 - accuracy: 0.8535 - val_loss: 1.2656 - val_accuracy: 0.6875
Epoch 103/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5782 - accuracy: 0.8561 - val_loss: 1.4653 - val_accuracy: 0.6642
Epoch 104/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5719 - accuracy: 0.8533 - val_loss: 1.2629 - val_accuracy: 0.6928
Epoch 105/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5728 - accuracy: 0.8543 - val_loss: 1.3407 - val_accuracy: 0.6809
Epoch 106/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5521 - accuracy: 0.8649 - val_loss: 1.1662 - val_accuracy: 0.7194
Epoch 107/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5786 - accuracy: 0.8525 - val_loss: 1.2450 - val_accuracy: 0.6948
Epoch 108/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5574 - accuracy: 0.8617 - val_loss: 1.4055 - val_accuracy: 0.6749
Epoch 109/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5499 - accuracy: 0.8583 - val_loss: 1.3186 - val_accuracy: 0.6961
Epoch 110/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5500 - accuracy: 0.8649 - val_loss: 1.2267 - val_accuracy: 0.7108
Epoch 111/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5669 - accuracy: 0.8534 - val_loss: 1.1660 - val_accuracy: 0.7168
Epoch 112/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5290 - accuracy: 0.8692 - val_loss: 1.2315 - val_accuracy: 0.7094
Epoch 113/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5519 - accuracy: 0.8648 - val_loss: 1.2171 - val_accuracy: 0.6988
Epoch 114/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5192 - accuracy: 0.8710 - val_loss: 1.2463 - val_accuracy: 0.7048
Epoch 115/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5351 - accuracy: 0.8701 - val_loss: 1.1934 - val_accuracy: 0.7161
Epoch 116/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5397 - accuracy: 0.8648 - val_loss: 1.2020 - val_accuracy: 0.7068
Epoch 117/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5356 - accuracy: 0.8663 - val_loss: 1.3067 - val_accuracy: 0.6955
Epoch 118/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5005 - accuracy: 0.8788 - val_loss: 1.2942 - val_accuracy: 0.7015
Epoch 119/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5229 - accuracy: 0.8720 - val_loss: 1.2442 - val_accuracy: 0.7041
Epoch 120/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5164 - accuracy: 0.8704 - val_loss: 1.2539 - val_accuracy: 0.7015
Epoch 121/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4978 - accuracy: 0.8849 - val_loss: 1.2031 - val_accuracy: 0.7108
Epoch 122/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5091 - accuracy: 0.8743 - val_loss: 1.3057 - val_accuracy: 0.6882
Epoch 123/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4896 - accuracy: 0.8841 - val_loss: 1.2677 - val_accuracy: 0.7021
Epoch 124/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5147 - accuracy: 0.8724 - val_loss: 1.2925 - val_accuracy: 0.7008
Epoch 125/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4857 - accuracy: 0.8805 - val_loss: 1.2648 - val_accuracy: 0.7008
Epoch 126/200
266/266 [==============================] - 1s 4ms/step - loss: 0.5041 - accuracy: 0.8806 - val_loss: 1.2280 - val_accuracy: 0.7021
Epoch 127/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4847 - accuracy: 0.8833 - val_loss: 1.3583 - val_accuracy: 0.6848
Epoch 128/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4849 - accuracy: 0.8878 - val_loss: 1.3009 - val_accuracy: 0.6902
Epoch 129/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4913 - accuracy: 0.8830 - val_loss: 1.2415 - val_accuracy: 0.7055
Epoch 130/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4864 - accuracy: 0.8826 - val_loss: 1.3134 - val_accuracy: 0.6941
Epoch 131/200
266/266 [==============================] - 1s 4ms/step - loss: 0.4875 - accuracy: 0.8886 - val_loss: 1.2766 - val_accuracy: 0.7088
In [ ]:
loss, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
losses_opt["CNN1"] = loss
accuracies_opt["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.166
Accuracy: 72.867%

CNN2

In [ ]:
def init_cnn2_model_optimized(summary, optimizer=tf.optimizers.Adam, lr=0.00005, classes_num=20):
  # Four convolutional blocks (Conv2D -> BatchNorm -> ReLU -> pooling -> Dropout)
  # with widening filters (32 -> 64 -> 128 -> 256), then a dense classifier head.
  model = models.Sequential()

  model.add(layers.Conv2D(32, (3, 3), kernel_regularizer=l2(0.001), padding="same", input_shape=(32, 32, 3)))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  model.add(layers.Conv2D(64, (3, 3), kernel_regularizer=l2(0.01), padding="same"))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  model.add(layers.Conv2D(128, (3, 3), kernel_regularizer=l2(0.01), padding="same"))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.MaxPooling2D((2, 2)))
  model.add(layers.Dropout(0.2))

  # Final conv block keeps the 4x4 spatial map (no pooling) before flattening.
  model.add(layers.Conv2D(256, (3, 3), kernel_regularizer=l2(0.01), padding="same"))
  model.add(layers.BatchNormalization())
  model.add(layers.ReLU())
  model.add(layers.Dropout(0.2))

  model.add(layers.Flatten())
  model.add(layers.Dense(512, activation='relu'))
  model.add(layers.Dropout(0.4))
  model.add(layers.Dense(classes_num, activation='softmax'))

  # Sparse categorical cross-entropy: labels are integer class indices, not one-hot.
  model.compile(optimizer=optimizer(learning_rate=lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary:
    model.summary()
  return model
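The per-layer parameter counts that Keras prints in `model.summary()` can be verified by hand with the standard formulas; a minimal sketch in plain Python (no TensorFlow required), using the CNN2 layer sizes:

```python
# Parameter-count formulas for the layer types used in CNN2.

def conv2d_params(k_h, k_w, in_ch, out_ch):
    # Each of the out_ch filters has k_h*k_w*in_ch weights plus one bias.
    return (k_h * k_w * in_ch + 1) * out_ch

def batchnorm_params(channels):
    # gamma, beta (trainable) plus moving mean, moving variance (non-trainable).
    return 4 * channels

def dense_params(in_units, out_units):
    # Weight matrix plus one bias per output unit.
    return (in_units + 1) * out_units

# First conv block: 3x3 kernels over RGB input, 32 filters.
print(conv2d_params(3, 3, 3, 32))      # 896
print(batchnorm_params(32))            # 128
# Dense head: the flattened 4*4*256 = 4096 features feed 512 units.
print(dense_params(4 * 4 * 256, 512))  # 2097664
```

These match the `conv2d_9`, `batch_normalization_9`, and `dense_6` rows of the summary; the BatchNorm count also explains the non-trainable parameters (2 of the 4 per channel are the moving statistics).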
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_9 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_9 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_9 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_10 (Dropout)         (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_10 (Conv2D)           (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_10 (Batc (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_10 (ReLU)              (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_7 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_11 (Dropout)         (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_11 (Batc (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_11 (ReLU)              (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_8 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_12 (Dropout)         (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_12 (Batc (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_12 (ReLU)              (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_13 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_3 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_6 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dropout_14 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_7 (Dense)              (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 3s 6ms/step - loss: 6.0085 - accuracy: 0.1176 - val_loss: 6.5021 - val_accuracy: 0.0519
Epoch 2/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2533 - accuracy: 0.2374 - val_loss: 5.5729 - val_accuracy: 0.1390
Epoch 3/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8535 - accuracy: 0.2993 - val_loss: 4.9882 - val_accuracy: 0.2340
Epoch 4/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5100 - accuracy: 0.3433 - val_loss: 4.5659 - val_accuracy: 0.2886
Epoch 5/200
266/266 [==============================] - 1s 5ms/step - loss: 4.1904 - accuracy: 0.3810 - val_loss: 4.2495 - val_accuracy: 0.3338
Epoch 6/200
266/266 [==============================] - 1s 5ms/step - loss: 3.8831 - accuracy: 0.4130 - val_loss: 4.2672 - val_accuracy: 0.3145
Epoch 7/200
266/266 [==============================] - 1s 5ms/step - loss: 3.6186 - accuracy: 0.4486 - val_loss: 4.2179 - val_accuracy: 0.3245
Epoch 8/200
266/266 [==============================] - 1s 5ms/step - loss: 3.3751 - accuracy: 0.4819 - val_loss: 3.7818 - val_accuracy: 0.3737
Epoch 9/200
266/266 [==============================] - 1s 5ms/step - loss: 3.1720 - accuracy: 0.5012 - val_loss: 4.0831 - val_accuracy: 0.3158
Epoch 10/200
266/266 [==============================] - 1s 5ms/step - loss: 3.0236 - accuracy: 0.5098 - val_loss: 3.2821 - val_accuracy: 0.4455
Epoch 11/200
266/266 [==============================] - 1s 5ms/step - loss: 2.8668 - accuracy: 0.5328 - val_loss: 3.4719 - val_accuracy: 0.4016
Epoch 12/200
266/266 [==============================] - 1s 5ms/step - loss: 2.6997 - accuracy: 0.5554 - val_loss: 3.1689 - val_accuracy: 0.4588
Epoch 13/200
266/266 [==============================] - 1s 5ms/step - loss: 2.5975 - accuracy: 0.5519 - val_loss: 3.0835 - val_accuracy: 0.4541
Epoch 14/200
266/266 [==============================] - 1s 5ms/step - loss: 2.4478 - accuracy: 0.5811 - val_loss: 3.0492 - val_accuracy: 0.4535
Epoch 15/200
266/266 [==============================] - 1s 5ms/step - loss: 2.3233 - accuracy: 0.5985 - val_loss: 3.1088 - val_accuracy: 0.4242
Epoch 16/200
266/266 [==============================] - 1s 5ms/step - loss: 2.1963 - accuracy: 0.6136 - val_loss: 2.7000 - val_accuracy: 0.4887
Epoch 17/200
266/266 [==============================] - 1s 5ms/step - loss: 2.1146 - accuracy: 0.6108 - val_loss: 2.9068 - val_accuracy: 0.4508
Epoch 18/200
266/266 [==============================] - 1s 5ms/step - loss: 2.0370 - accuracy: 0.6233 - val_loss: 2.4400 - val_accuracy: 0.5412
Epoch 19/200
266/266 [==============================] - 1s 5ms/step - loss: 1.9289 - accuracy: 0.6471 - val_loss: 2.4837 - val_accuracy: 0.5246
Epoch 20/200
266/266 [==============================] - 1s 5ms/step - loss: 1.8697 - accuracy: 0.6518 - val_loss: 2.3227 - val_accuracy: 0.5406
Epoch 21/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7694 - accuracy: 0.6665 - val_loss: 2.4993 - val_accuracy: 0.5066
Epoch 22/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7298 - accuracy: 0.6708 - val_loss: 2.4610 - val_accuracy: 0.5007
Epoch 23/200
266/266 [==============================] - 1s 5ms/step - loss: 1.6348 - accuracy: 0.6939 - val_loss: 2.2486 - val_accuracy: 0.5512
Epoch 24/200
266/266 [==============================] - 1s 5ms/step - loss: 1.5646 - accuracy: 0.7027 - val_loss: 2.1906 - val_accuracy: 0.5479
Epoch 25/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4996 - accuracy: 0.7141 - val_loss: 2.0498 - val_accuracy: 0.5851
Epoch 26/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4687 - accuracy: 0.7137 - val_loss: 2.2388 - val_accuracy: 0.5525
Epoch 27/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4006 - accuracy: 0.7329 - val_loss: 2.3473 - val_accuracy: 0.5259
Epoch 28/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3729 - accuracy: 0.7288 - val_loss: 1.9856 - val_accuracy: 0.5824
Epoch 29/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2972 - accuracy: 0.7461 - val_loss: 1.8760 - val_accuracy: 0.6157
Epoch 30/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2888 - accuracy: 0.7426 - val_loss: 1.8719 - val_accuracy: 0.6190
Epoch 31/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2308 - accuracy: 0.7584 - val_loss: 1.9089 - val_accuracy: 0.6084
Epoch 32/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1673 - accuracy: 0.7759 - val_loss: 1.7726 - val_accuracy: 0.6250
Epoch 33/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1444 - accuracy: 0.7657 - val_loss: 1.9233 - val_accuracy: 0.6004
Epoch 34/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1231 - accuracy: 0.7793 - val_loss: 1.9907 - val_accuracy: 0.5818
Epoch 35/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0622 - accuracy: 0.7946 - val_loss: 1.6364 - val_accuracy: 0.6496
Epoch 36/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0504 - accuracy: 0.7927 - val_loss: 1.7129 - val_accuracy: 0.6336
Epoch 37/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0329 - accuracy: 0.7936 - val_loss: 1.8592 - val_accuracy: 0.5997
Epoch 38/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9922 - accuracy: 0.7984 - val_loss: 1.7118 - val_accuracy: 0.6423
Epoch 39/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9670 - accuracy: 0.8128 - val_loss: 1.6871 - val_accuracy: 0.6523
Epoch 40/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9442 - accuracy: 0.8199 - val_loss: 1.9142 - val_accuracy: 0.6004
Epoch 41/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8818 - accuracy: 0.8310 - val_loss: 1.6702 - val_accuracy: 0.6576
Epoch 42/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8863 - accuracy: 0.8244 - val_loss: 1.5575 - val_accuracy: 0.6609
Epoch 43/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8418 - accuracy: 0.8411 - val_loss: 1.6757 - val_accuracy: 0.6476
Epoch 44/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8436 - accuracy: 0.8398 - val_loss: 1.6933 - val_accuracy: 0.6350
Epoch 45/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8272 - accuracy: 0.8406 - val_loss: 1.5936 - val_accuracy: 0.6483
Epoch 46/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7921 - accuracy: 0.8460 - val_loss: 1.8341 - val_accuracy: 0.6243
Epoch 47/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7627 - accuracy: 0.8563 - val_loss: 1.4884 - val_accuracy: 0.6828
Epoch 48/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7657 - accuracy: 0.8543 - val_loss: 1.7062 - val_accuracy: 0.6343
Epoch 49/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7408 - accuracy: 0.8611 - val_loss: 1.6081 - val_accuracy: 0.6596
Epoch 50/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7071 - accuracy: 0.8698 - val_loss: 1.7828 - val_accuracy: 0.6283
Epoch 51/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7150 - accuracy: 0.8694 - val_loss: 1.5510 - val_accuracy: 0.6556
Epoch 52/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7295 - accuracy: 0.8622 - val_loss: 1.5050 - val_accuracy: 0.6749
Epoch 53/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6767 - accuracy: 0.8755 - val_loss: 1.8027 - val_accuracy: 0.6350
Epoch 54/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6738 - accuracy: 0.8761 - val_loss: 1.5247 - val_accuracy: 0.6782
Epoch 55/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6519 - accuracy: 0.8832 - val_loss: 1.5472 - val_accuracy: 0.6616
Epoch 56/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6428 - accuracy: 0.8812 - val_loss: 1.5564 - val_accuracy: 0.6749
Epoch 57/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6411 - accuracy: 0.8871 - val_loss: 1.5291 - val_accuracy: 0.6742
Epoch 58/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6223 - accuracy: 0.8852 - val_loss: 1.6338 - val_accuracy: 0.6616
Epoch 59/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6006 - accuracy: 0.8948 - val_loss: 1.5293 - val_accuracy: 0.6848
Epoch 60/200
266/266 [==============================] - 1s 6ms/step - loss: 0.6087 - accuracy: 0.8922 - val_loss: 1.5647 - val_accuracy: 0.6715
Epoch 61/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6025 - accuracy: 0.8911 - val_loss: 1.6821 - val_accuracy: 0.6469
Epoch 62/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5686 - accuracy: 0.9023 - val_loss: 1.4737 - val_accuracy: 0.6981
Epoch 63/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5653 - accuracy: 0.9042 - val_loss: 1.5711 - val_accuracy: 0.6682
Epoch 64/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5668 - accuracy: 0.9000 - val_loss: 1.6850 - val_accuracy: 0.6616
Epoch 65/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5605 - accuracy: 0.9002 - val_loss: 1.7254 - val_accuracy: 0.6436
Epoch 66/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5660 - accuracy: 0.8955 - val_loss: 1.5492 - val_accuracy: 0.6755
Epoch 67/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5461 - accuracy: 0.8972 - val_loss: 1.5217 - val_accuracy: 0.6809
Epoch 68/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5259 - accuracy: 0.9137 - val_loss: 1.4815 - val_accuracy: 0.6888
Epoch 69/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5145 - accuracy: 0.9111 - val_loss: 1.5661 - val_accuracy: 0.6755
Epoch 70/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5171 - accuracy: 0.9128 - val_loss: 1.7396 - val_accuracy: 0.6529
Epoch 71/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5200 - accuracy: 0.9118 - val_loss: 1.4266 - val_accuracy: 0.6868
Epoch 72/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5001 - accuracy: 0.9176 - val_loss: 1.6071 - val_accuracy: 0.6769
Epoch 73/200
266/266 [==============================] - 1s 5ms/step - loss: 0.5117 - accuracy: 0.9076 - val_loss: 1.7266 - val_accuracy: 0.6516
Epoch 74/200
266/266 [==============================] - 1s 6ms/step - loss: 0.4884 - accuracy: 0.9158 - val_loss: 1.4595 - val_accuracy: 0.6941
Epoch 75/200
266/266 [==============================] - 1s 6ms/step - loss: 0.4708 - accuracy: 0.9228 - val_loss: 1.4992 - val_accuracy: 0.6888
Epoch 76/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4936 - accuracy: 0.9190 - val_loss: 1.5836 - val_accuracy: 0.6822
Epoch 77/200
266/266 [==============================] - 1s 6ms/step - loss: 0.4778 - accuracy: 0.9176 - val_loss: 1.6307 - val_accuracy: 0.6722
Epoch 78/200
266/266 [==============================] - 1s 6ms/step - loss: 0.4678 - accuracy: 0.9282 - val_loss: 1.5790 - val_accuracy: 0.6828
Epoch 79/200
266/266 [==============================] - 1s 6ms/step - loss: 0.4546 - accuracy: 0.9289 - val_loss: 1.4057 - val_accuracy: 0.7088
Epoch 80/200
266/266 [==============================] - 2s 6ms/step - loss: 0.4665 - accuracy: 0.9240 - val_loss: 1.5759 - val_accuracy: 0.6961
Epoch 81/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4675 - accuracy: 0.9256 - val_loss: 1.5304 - val_accuracy: 0.6888
Epoch 82/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4559 - accuracy: 0.9258 - val_loss: 1.5042 - val_accuracy: 0.7035
Epoch 83/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4548 - accuracy: 0.9249 - val_loss: 1.6583 - val_accuracy: 0.6656
Epoch 84/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4593 - accuracy: 0.9215 - val_loss: 1.4063 - val_accuracy: 0.7048
Epoch 85/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4616 - accuracy: 0.9196 - val_loss: 1.4550 - val_accuracy: 0.7055
Epoch 86/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4458 - accuracy: 0.9295 - val_loss: 1.5999 - val_accuracy: 0.6789
Epoch 87/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4469 - accuracy: 0.9269 - val_loss: 1.6171 - val_accuracy: 0.6828
Epoch 88/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4374 - accuracy: 0.9330 - val_loss: 1.5255 - val_accuracy: 0.7061
Epoch 89/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4410 - accuracy: 0.9291 - val_loss: 1.4785 - val_accuracy: 0.6908
Epoch 90/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4323 - accuracy: 0.9280 - val_loss: 1.6111 - val_accuracy: 0.6802
Epoch 91/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4204 - accuracy: 0.9379 - val_loss: 1.4723 - val_accuracy: 0.6995
Epoch 92/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4140 - accuracy: 0.9398 - val_loss: 1.6118 - val_accuracy: 0.6822
Epoch 93/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4210 - accuracy: 0.9361 - val_loss: 1.4412 - val_accuracy: 0.7035
Epoch 94/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4095 - accuracy: 0.9383 - val_loss: 1.8642 - val_accuracy: 0.6616
Epoch 95/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4160 - accuracy: 0.9347 - val_loss: 1.5704 - val_accuracy: 0.6828
Epoch 96/200
266/266 [==============================] - 1s 5ms/step - loss: 0.3919 - accuracy: 0.9460 - val_loss: 1.5274 - val_accuracy: 0.6875
Epoch 97/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4014 - accuracy: 0.9378 - val_loss: 1.4702 - val_accuracy: 0.7001
Epoch 98/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4029 - accuracy: 0.9363 - val_loss: 1.8350 - val_accuracy: 0.6742
Epoch 99/200
266/266 [==============================] - 1s 5ms/step - loss: 0.4129 - accuracy: 0.9374 - val_loss: 1.5638 - val_accuracy: 0.7074
In [ ]:
loss, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
losses_opt["CNN2"] = loss
accuracies_opt["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.391
Accuracy: 71.528%

Transfer learning

VGG16

In [ ]:
# transfer learning: VGG16 trained on ImageNet without the top layer

def init_VGG16_model_optimized(summary, optimizer = tf.optimizers.Adam, lr = 0.00005, classes_num = 20):
  VGG16_MODEL=tf.keras.applications.VGG16(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  # unfreeze conv layers
  VGG16_MODEL.trainable=True
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(classes_num,activation='softmax')
  model = tf.keras.Sequential([VGG16_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs = 200, callbacks = [callback])
Model: "sequential_13"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_10 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_10  (None, 512)               0         
_________________________________________________________________
dense_17 (Dense)             (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 9s 30ms/step - loss: 2.8043 - accuracy: 0.1621 - val_loss: 1.7364 - val_accuracy: 0.5020
Epoch 2/200
266/266 [==============================] - 8s 29ms/step - loss: 1.6042 - accuracy: 0.5234 - val_loss: 1.2300 - val_accuracy: 0.6336
Epoch 3/200
266/266 [==============================] - 8s 30ms/step - loss: 1.0764 - accuracy: 0.6965 - val_loss: 1.1417 - val_accuracy: 0.6582
Epoch 4/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7743 - accuracy: 0.7732 - val_loss: 1.0391 - val_accuracy: 0.7134
Epoch 5/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5791 - accuracy: 0.8324 - val_loss: 0.9489 - val_accuracy: 0.7480
Epoch 6/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4153 - accuracy: 0.8801 - val_loss: 1.0559 - val_accuracy: 0.7261
Epoch 7/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3097 - accuracy: 0.9103 - val_loss: 1.1142 - val_accuracy: 0.7420
Epoch 8/200
266/266 [==============================] - 8s 30ms/step - loss: 0.2312 - accuracy: 0.9335 - val_loss: 1.1370 - val_accuracy: 0.7467
Epoch 9/200
266/266 [==============================] - 8s 30ms/step - loss: 0.1949 - accuracy: 0.9465 - val_loss: 1.0496 - val_accuracy: 0.7680
Epoch 10/200
266/266 [==============================] - 8s 30ms/step - loss: 0.1257 - accuracy: 0.9667 - val_loss: 1.2414 - val_accuracy: 0.7434
Epoch 11/200
266/266 [==============================] - 8s 30ms/step - loss: 0.1355 - accuracy: 0.9628 - val_loss: 1.2067 - val_accuracy: 0.7566
Epoch 12/200
266/266 [==============================] - 8s 30ms/step - loss: 0.1144 - accuracy: 0.9679 - val_loss: 1.1631 - val_accuracy: 0.7527
Epoch 13/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0622 - accuracy: 0.9828 - val_loss: 1.2158 - val_accuracy: 0.7520
Epoch 14/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0808 - accuracy: 0.9807 - val_loss: 1.3252 - val_accuracy: 0.7247
Epoch 15/200
266/266 [==============================] - 8s 30ms/step - loss: 0.1186 - accuracy: 0.9699 - val_loss: 1.0913 - val_accuracy: 0.7759
Epoch 16/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0607 - accuracy: 0.9825 - val_loss: 1.2933 - val_accuracy: 0.7620
Epoch 17/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0838 - accuracy: 0.9779 - val_loss: 1.3168 - val_accuracy: 0.7360
Epoch 18/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0383 - accuracy: 0.9887 - val_loss: 1.3932 - val_accuracy: 0.7586
Epoch 19/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0670 - accuracy: 0.9794 - val_loss: 1.4065 - val_accuracy: 0.7566
Epoch 20/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0866 - accuracy: 0.9747 - val_loss: 1.1494 - val_accuracy: 0.7640
Epoch 21/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0554 - accuracy: 0.9856 - val_loss: 1.2662 - val_accuracy: 0.7739
Epoch 22/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0305 - accuracy: 0.9910 - val_loss: 1.3164 - val_accuracy: 0.7440
Epoch 23/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0603 - accuracy: 0.9844 - val_loss: 1.4329 - val_accuracy: 0.7540
Epoch 24/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0697 - accuracy: 0.9847 - val_loss: 1.1749 - val_accuracy: 0.7846
Epoch 25/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0364 - accuracy: 0.9907 - val_loss: 1.5004 - val_accuracy: 0.7360
In [ ]:
loss, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
losses_opt["VGG_ALL"] = loss
accuracies_opt["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.931
Accuracy: 74.504%

MobileNet

In [ ]:
# transfer learning: MobileNet trained on ImageNet without the top layer

def init_MobileNetV2_model_optimized(summary, optimizer = tf.optimizers.Adam, lr = 0.00005, classes_num = 20):
  mobilenetV2_model=tf.keras.applications.MobileNetV2(input_shape=(IMG_SIZE,IMG_SIZE,3), include_top=False, weights='imagenet')
  
  MobileNetV2_MODEL = mobilenetV2_model

  # unfreeze conv layers
  MobileNetV2_MODEL.trainable=True
  
  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(classes_num,activation='softmax')
  model = tf.keras.Sequential([MobileNetV2_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9412608/9406464 [==============================] - 0s 0us/step
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_16 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1280)              0         
_________________________________________________________________
dense_9 (Dense)              (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 65s 231ms/step - loss: 1.7136 - accuracy: 0.5085 - val_loss: 2.1306 - val_accuracy: 0.4561
Epoch 2/200
266/266 [==============================] - 61s 228ms/step - loss: 0.3269 - accuracy: 0.9072 - val_loss: 2.6106 - val_accuracy: 0.3451
Epoch 3/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1353 - accuracy: 0.9656 - val_loss: 2.8981 - val_accuracy: 0.3464
Epoch 4/200
266/266 [==============================] - 60s 225ms/step - loss: 0.0812 - accuracy: 0.9810 - val_loss: 2.3707 - val_accuracy: 0.4176
Epoch 5/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0435 - accuracy: 0.9910 - val_loss: 2.7497 - val_accuracy: 0.3956
Epoch 6/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0343 - accuracy: 0.9930 - val_loss: 2.3872 - val_accuracy: 0.4481
Epoch 7/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0222 - accuracy: 0.9959 - val_loss: 1.7845 - val_accuracy: 0.5505
Epoch 8/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0278 - accuracy: 0.9922 - val_loss: 1.6081 - val_accuracy: 0.6144
Epoch 9/200
266/266 [==============================] - 61s 230ms/step - loss: 0.0235 - accuracy: 0.9927 - val_loss: 1.5000 - val_accuracy: 0.6184
Epoch 10/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0241 - accuracy: 0.9929 - val_loss: 1.2010 - val_accuracy: 0.6782
Epoch 11/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0318 - accuracy: 0.9905 - val_loss: 0.8667 - val_accuracy: 0.7759
Epoch 12/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0203 - accuracy: 0.9947 - val_loss: 0.7350 - val_accuracy: 0.8138
Epoch 13/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0288 - accuracy: 0.9906 - val_loss: 0.9945 - val_accuracy: 0.7985
Epoch 14/200
266/266 [==============================] - 61s 230ms/step - loss: 0.0288 - accuracy: 0.9919 - val_loss: 0.9003 - val_accuracy: 0.8005
Epoch 15/200
266/266 [==============================] - 61s 230ms/step - loss: 0.0290 - accuracy: 0.9925 - val_loss: 0.9041 - val_accuracy: 0.8158
Epoch 16/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0254 - accuracy: 0.9920 - val_loss: 0.7859 - val_accuracy: 0.8165
Epoch 17/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0280 - accuracy: 0.9917 - val_loss: 0.7582 - val_accuracy: 0.8285
Epoch 18/200
266/266 [==============================] - 60s 224ms/step - loss: 0.0178 - accuracy: 0.9936 - val_loss: 0.6987 - val_accuracy: 0.8477
Epoch 19/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0114 - accuracy: 0.9978 - val_loss: 0.8148 - val_accuracy: 0.8338
Epoch 20/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0121 - accuracy: 0.9968 - val_loss: 0.6696 - val_accuracy: 0.8524
Epoch 21/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0181 - accuracy: 0.9948 - val_loss: 0.6270 - val_accuracy: 0.8577
Epoch 22/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0206 - accuracy: 0.9919 - val_loss: 0.9108 - val_accuracy: 0.7985
Epoch 23/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0326 - accuracy: 0.9892 - val_loss: 0.7413 - val_accuracy: 0.8444
Epoch 24/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0183 - accuracy: 0.9936 - val_loss: 0.7567 - val_accuracy: 0.8517
Epoch 25/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0322 - accuracy: 0.9906 - val_loss: 0.7787 - val_accuracy: 0.8477
Epoch 26/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0208 - accuracy: 0.9930 - val_loss: 0.6050 - val_accuracy: 0.8657
Epoch 27/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0131 - accuracy: 0.9960 - val_loss: 0.5731 - val_accuracy: 0.8717
Epoch 28/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0118 - accuracy: 0.9956 - val_loss: 0.6023 - val_accuracy: 0.8730
Epoch 29/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0158 - accuracy: 0.9946 - val_loss: 0.7374 - val_accuracy: 0.8590
Epoch 30/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0116 - accuracy: 0.9961 - val_loss: 0.6926 - val_accuracy: 0.8637
Epoch 31/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0124 - accuracy: 0.9962 - val_loss: 0.8128 - val_accuracy: 0.8444
Epoch 32/200
266/266 [==============================] - 59s 222ms/step - loss: 0.0118 - accuracy: 0.9963 - val_loss: 0.6217 - val_accuracy: 0.8743
Epoch 33/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0078 - accuracy: 0.9969 - val_loss: 0.7522 - val_accuracy: 0.8551
Epoch 34/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0095 - accuracy: 0.9973 - val_loss: 0.9985 - val_accuracy: 0.7919
Epoch 35/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0418 - accuracy: 0.9845 - val_loss: 0.9368 - val_accuracy: 0.7826
Epoch 36/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0250 - accuracy: 0.9919 - val_loss: 1.0916 - val_accuracy: 0.7773
Epoch 37/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0174 - accuracy: 0.9940 - val_loss: 0.9431 - val_accuracy: 0.8118
Epoch 38/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0192 - accuracy: 0.9927 - val_loss: 0.8255 - val_accuracy: 0.8398
Epoch 39/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0114 - accuracy: 0.9964 - val_loss: 0.7011 - val_accuracy: 0.8531
Epoch 40/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0116 - accuracy: 0.9953 - val_loss: 0.6661 - val_accuracy: 0.8677
Epoch 41/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0123 - accuracy: 0.9957 - val_loss: 0.6213 - val_accuracy: 0.8697
Epoch 42/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0149 - accuracy: 0.9956 - val_loss: 0.7256 - val_accuracy: 0.8511
Epoch 43/200
266/266 [==============================] - 61s 230ms/step - loss: 0.0098 - accuracy: 0.9971 - val_loss: 0.7140 - val_accuracy: 0.8597
Epoch 44/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0127 - accuracy: 0.9963 - val_loss: 0.7937 - val_accuracy: 0.8551
Epoch 45/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0124 - accuracy: 0.9956 - val_loss: 0.7468 - val_accuracy: 0.8617
Epoch 46/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0158 - accuracy: 0.9959 - val_loss: 0.7468 - val_accuracy: 0.8677
Epoch 47/200
266/266 [==============================] - 61s 230ms/step - loss: 0.0216 - accuracy: 0.9927 - val_loss: 0.9517 - val_accuracy: 0.8457
In [ ]:
loss, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
losses_opt["MOBILENET_ALL"] = loss
accuracies_opt["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.619
Accuracy: 88.145%

DenseNet

In [ ]:
# transfer learning: DenseNet trained on ImageNet without the top layer

def init_DENSENET_model_optimized(summary, optimizer = tf.optimizers.Adam, lr = 0.00005, classes_num = 20):
  densenet_model=tf.keras.applications.densenet.DenseNet121(input_shape=(32,32,3), include_top=False, weights='imagenet')
  
  DENSENET_MODEL = densenet_model

  # unfreeze conv layers
  DENSENET_MODEL.trainable = True

  dropout_layer = tf.keras.layers.Dropout(rate = 0.5)
  global_average_layer = tf.keras.layers.GlobalAveragePooling2D()

  # add top layer for CIFAR100 classification
  prediction_layer = tf.keras.layers.Dense(classes_num,activation='softmax')
  model = tf.keras.Sequential([DENSENET_MODEL, dropout_layer, global_average_layer, prediction_layer])
  model.compile(optimizer=optimizer(learning_rate = lr), loss=tf.keras.losses.sparse_categorical_crossentropy, metrics=["accuracy"])
  if summary: 
    model.summary()
  return model
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout (Dropout)            (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 26s 37ms/step - loss: 3.6479 - accuracy: 0.1447 - val_loss: 1.8509 - val_accuracy: 0.5007
Epoch 2/200
266/266 [==============================] - 8s 31ms/step - loss: 1.8607 - accuracy: 0.4694 - val_loss: 1.2473 - val_accuracy: 0.6569
Epoch 3/200
266/266 [==============================] - 8s 31ms/step - loss: 1.3057 - accuracy: 0.6197 - val_loss: 1.0151 - val_accuracy: 0.7015
Epoch 4/200
266/266 [==============================] - 8s 31ms/step - loss: 1.0105 - accuracy: 0.6979 - val_loss: 0.9847 - val_accuracy: 0.7121
Epoch 5/200
266/266 [==============================] - 8s 31ms/step - loss: 0.8159 - accuracy: 0.7616 - val_loss: 0.9023 - val_accuracy: 0.7320
Epoch 6/200
266/266 [==============================] - 8s 31ms/step - loss: 0.6449 - accuracy: 0.8022 - val_loss: 0.9002 - val_accuracy: 0.7547
Epoch 7/200
266/266 [==============================] - 8s 32ms/step - loss: 0.5109 - accuracy: 0.8376 - val_loss: 0.9216 - val_accuracy: 0.7520
Epoch 8/200
266/266 [==============================] - 8s 31ms/step - loss: 0.4095 - accuracy: 0.8755 - val_loss: 0.8527 - val_accuracy: 0.7653
Epoch 9/200
266/266 [==============================] - 8s 31ms/step - loss: 0.3255 - accuracy: 0.9028 - val_loss: 0.8739 - val_accuracy: 0.7573
Epoch 10/200
266/266 [==============================] - 8s 31ms/step - loss: 0.2531 - accuracy: 0.9237 - val_loss: 0.9105 - val_accuracy: 0.7520
Epoch 11/200
266/266 [==============================] - 8s 30ms/step - loss: 0.2247 - accuracy: 0.9317 - val_loss: 0.9622 - val_accuracy: 0.7527
Epoch 12/200
266/266 [==============================] - 8s 31ms/step - loss: 0.2146 - accuracy: 0.9300 - val_loss: 0.8835 - val_accuracy: 0.7746
Epoch 13/200
266/266 [==============================] - 8s 31ms/step - loss: 0.1599 - accuracy: 0.9501 - val_loss: 0.9384 - val_accuracy: 0.7726
Epoch 14/200
266/266 [==============================] - 8s 31ms/step - loss: 0.1540 - accuracy: 0.9559 - val_loss: 1.2135 - val_accuracy: 0.7261
Epoch 15/200
266/266 [==============================] - 8s 32ms/step - loss: 0.1506 - accuracy: 0.9581 - val_loss: 0.9700 - val_accuracy: 0.7759
Epoch 16/200
266/266 [==============================] - 8s 31ms/step - loss: 0.1226 - accuracy: 0.9614 - val_loss: 0.9786 - val_accuracy: 0.7626
Epoch 17/200
266/266 [==============================] - 8s 31ms/step - loss: 0.1135 - accuracy: 0.9648 - val_loss: 1.0855 - val_accuracy: 0.7613
Epoch 18/200
266/266 [==============================] - 8s 31ms/step - loss: 0.1169 - accuracy: 0.9611 - val_loss: 0.9681 - val_accuracy: 0.7733
Epoch 19/200
266/266 [==============================] - 8s 31ms/step - loss: 0.1102 - accuracy: 0.9657 - val_loss: 1.0088 - val_accuracy: 0.7799
Epoch 20/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0895 - accuracy: 0.9732 - val_loss: 1.0557 - val_accuracy: 0.7673
Epoch 21/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0904 - accuracy: 0.9709 - val_loss: 1.0262 - val_accuracy: 0.7666
Epoch 22/200
266/266 [==============================] - 8s 31ms/step - loss: 0.0939 - accuracy: 0.9710 - val_loss: 1.0636 - val_accuracy: 0.7719
Epoch 23/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0900 - accuracy: 0.9721 - val_loss: 1.0704 - val_accuracy: 0.7600
Epoch 24/200
266/266 [==============================] - 8s 30ms/step - loss: 0.1006 - accuracy: 0.9652 - val_loss: 1.0419 - val_accuracy: 0.7773
Epoch 25/200
266/266 [==============================] - 8s 31ms/step - loss: 0.0811 - accuracy: 0.9782 - val_loss: 0.9641 - val_accuracy: 0.7886
Epoch 26/200
266/266 [==============================] - 8s 30ms/step - loss: 0.0742 - accuracy: 0.9784 - val_loss: 0.9963 - val_accuracy: 0.7839
Epoch 27/200
266/266 [==============================] - 8s 31ms/step - loss: 0.0731 - accuracy: 0.9779 - val_loss: 0.9571 - val_accuracy: 0.7899
Epoch 28/200
266/266 [==============================] - 8s 31ms/step - loss: 0.0644 - accuracy: 0.9804 - val_loss: 1.0275 - val_accuracy: 0.7766
In [ ]:
loss, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
losses_opt["DENSENET_ALL"] = loss
accuracies_opt["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.908
Accuracy: 75.496%

Comparison bar plots

In [ ]:
# set width of bar
barWidth = 0.15
model_names = ['Simple Model', 'CNN1', 'CNN2', 'VGG16', 'MobileNet', 'DenseNet']

# set height of bars
bar1 = [accuracies["SIMPLE_MODEL"],accuracies["CNN1"],accuracies["CNN2"],accuracies["VGG_ALL"],accuracies["MOBILENET_ALL"],accuracies["DENSENET_ALL"]]
bar2 = [accuracies_opt["SIMPLE_MODEL"],accuracies_opt["CNN1"],accuracies_opt["CNN2"],accuracies_opt["VGG_ALL"],accuracies_opt["MOBILENET_ALL"],accuracies_opt["DENSENET_ALL"]]

# Set position of bar on X axis
r1 = np.arange(6)
r2 = [x + barWidth for x in r1]


plt.figure(figsize=(12,5))
plt.bar(r1, bar1, color='#003f5c', width=barWidth, edgecolor='white', label = 'Initial models')
plt.bar(r2, bar2, color='#ffa600', width=barWidth, edgecolor='white', label = 'Optimized models')
plt.xticks([r + (barWidth/2) for r in range(6)], model_names)
plt.ylim(bottom=0.1)
plt.legend(loc='best')
plt.title("Comparison between non-optimized and optimized models")
plt.ylabel("Classification Accuracy")
plt.grid(axis="y", linestyle="--")
plt.show()

We observe that most of the optimized models show an improvement in performance over the initial (non-optimized) ones. The gain is most pronounced for the from-scratch models, where it reaches roughly 15%. In the transfer-learning case, VGG16 and DenseNet show a small drop in classification accuracy. This is because, during optimization, we use Early Stopping that monitors the validation loss rather than the accuracy. It is therefore possible to end up with an optimized model of slightly lower accuracy but also lower classification loss. This is desirable, since such a model can classify images with greater confidence.
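The early-stopping behaviour described above (monitor the validation loss, stop after a patience window, keep the best epoch) can be sketched framework-free as follows. This is an illustrative reimplementation of the logic behind Keras' `EarlyStopping(monitor='val_loss', restore_best_weights=True)`, not the notebook's actual callback:

```python
def early_stopping_fit(val_losses, patience=20):
    """Return (stop_epoch, best_epoch) for a sequence of per-epoch
    validation losses, mimicking EarlyStopping with best-weight restore."""
    best_loss = float("inf")
    best_epoch = 0
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:      # improvement: remember it, reset patience
            best_loss = loss
            best_epoch = epoch
            wait = 0
        else:                     # no improvement this epoch
            wait += 1
            if wait >= patience:  # give up; the best epoch's weights are kept
                return epoch, best_epoch
    return len(val_losses) - 1, best_epoch

# losses improve, then plateau: training stops, the best epoch is restored
stop, best = early_stopping_fit([1.0, 0.8, 0.9, 0.85, 0.95], patience=3)
print(stop, best)  # -> 4 1
```

With a patience of 20, as used in the notebook, training runs at least 20 epochs past the last improvement before stopping, which explains why the optimized runs terminate well before 200 epochs.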

In [ ]:
# set width of bar
barWidth = 0.15
model_names = ['Simple Model', 'CNN1', 'CNN2', 'VGG16', 'MobileNet', 'DenseNet']

# set height of bars
bar1 = [losses["SIMPLE_MODEL"],losses["CNN1"],losses["CNN2"],losses["VGG_ALL"],losses["MOBILENET_ALL"],losses["DENSENET_ALL"]]
bar2 = [losses_opt["SIMPLE_MODEL"],losses_opt["CNN1"],losses_opt["CNN2"],losses_opt["VGG_ALL"],losses_opt["MOBILENET_ALL"],losses_opt["DENSENET_ALL"]]

# Set position of bar on X axis
r1 = np.arange(6)
r2 = [x + barWidth for x in r1]


plt.figure(figsize=(12,5))
plt.bar(r1, bar1, color='#003f5c', width=barWidth, edgecolor='white', label = 'Initial models')
plt.bar(r2, bar2, color='#ffa600', width=barWidth, edgecolor='white', label = 'Optimized models')
plt.xticks([r + (barWidth/2) for r in range(6)], model_names)
plt.ylim(bottom=0.1)
plt.legend(loc='best')
plt.title("Comparison between non-optimized and optimized models")
plt.ylabel("Classification Loss")
plt.grid(axis="y", linestyle="--")
plt.show()

We observe that all optimized models show a lower loss than the initial ones. As mentioned above, this is expected: with Early Stopping we halt training when the validation loss has not improved for more than 20 epochs and keep the model with the lowest loss. It is worth noting that the from-scratch models, as well as VGG16, show the largest improvement here.

Effect of the number of classes on performance

We successively increase the number of classes (and, correspondingly, the amount of data) from 20 to 40, 60 and finally 80, in order to see how this increase affects the test accuracy of our optimized models. Note that all other hyperparameters are kept fixed.
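The sweep just described amounts to a nested loop over class counts and models. The sketch below only shows the bookkeeping; the `evaluate` argument is a stub standing in for the real train-and-evaluate step, and any values it returns are purely illustrative, not experimental results:

```python
def sweep(class_counts, evaluate):
    """Collect one test-accuracy value per (model, class-count) pair.
    `evaluate(name, n)` is a stand-in for the real fit-and-test step."""
    return {name: {n: evaluate(name, n) for n in class_counts}
            for name in ("CNN2", "VGG16", "MobileNet", "DenseNet")}

# stub evaluator: the numbers only illustrate the structure of the result
results = sweep([20, 40, 60, 80], lambda name, n: round(1.0 - n / 200, 2))
print(results["CNN2"])  # -> {20: 0.9, 40: 0.8, 60: 0.7, 80: 0.6}
```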

We first define the dictionary fit_times, which holds the training times of all models when the number of classes equals 80.

In [ ]:
fit_times = {}

At this point we redefine the train_model function so that, in addition to the history, it also returns the training time.

In [ ]:
def train_model(model, train_dataset = train_ds, validation_dataset = validation_ds,
                epochs = 100, callbacks = None,
                steps_per_epoch = int(np.ceil(x_train.shape[0]/BATCH_SIZE)),
                validation_steps = int(np.ceil(x_val.shape[0]/BATCH_SIZE))):
    start_time = time.time()
    history = model.fit(train_dataset, epochs=epochs, steps_per_epoch=steps_per_epoch,
                        validation_data=validation_dataset,
                        validation_steps=validation_steps, callbacks=callbacks)
    fit_time = time.time() - start_time
    return history, fit_time
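The timing pattern above (record a start time, run the fit, subtract) generalizes to any callable. A hedged, reusable sketch is shown below; the `timed` helper is not part of the notebook, and it uses `time.perf_counter()`, which is monotonic and higher-resolution than `time.time()`, so it is generally the safer choice for measuring durations:

```python
import time

# Illustrative helper (not from the notebook): call fn(*args, **kwargs)
# and return its result together with the elapsed wall-clock time.
def timed(fn, *args, **kwargs):
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed
```

Usage would look like `history, fit_time = timed(model.fit, train_ds, epochs=200)`, keeping the timing logic out of the training function itself.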

We also define the function fit_and_test_model, which we use throughout this section to avoid repeating the same block of code. It builds the new dataset for the number of classes given as an argument, then trains and evaluates the model. When the number of classes equals 80, it also records the training time in the fit_times dictionary.

In [ ]:
def fit_and_test_model(number_of_classes, optimized_model, model_name):
    
    # select the number of classes
    cifar100_classes_url = select_classes_number(number_of_classes)

    team_classes = pd.read_csv(cifar100_classes_url, sep=',', header=None)
    CIFAR100_LABELS_LIST = pd.read_csv('https://pastebin.com/raw/qgDaNggt', sep=',', header=None).astype(str).values.tolist()[0]

    our_index = team_classes.iloc[team_seed,:].values.tolist()
    our_classes = select_from_list(CIFAR100_LABELS_LIST, our_index)
    train_index = get_ds_index(y_train_all, our_index)
    test_index = get_ds_index(y_test_all, our_index)

    x_train_ds = np.asarray(select_from_list(x_train_all, train_index))
    y_train_ds = np.asarray(select_from_list(y_train_all, train_index))
    x_test_ds = np.asarray(select_from_list(x_test_all, test_index))
    y_test_ds = np.asarray(select_from_list(y_test_all, test_index))

    # get (train) dataset dimensions
    data_size, img_rows, img_cols, img_channels = x_train_ds.shape

    # set validation set percentage (wrt the training set size)
    validation_percentage = 0.15
    val_size = round(validation_percentage * data_size)

    # Reserve val_size samples for validation and normalize all values
    x_val = x_train_ds[-val_size:]/255
    y_val = y_train_ds[-val_size:]
    x_train = x_train_ds[:-val_size]/255
    y_train = y_train_ds[:-val_size]
    x_test = x_test_ds/255
    y_test = y_test_ds

    y_train = create_new_labels(our_index,y_train)
    y_val = create_new_labels(our_index,y_val)
    y_test = create_new_labels(our_index,y_test)

    train_ds =_input_fn(x_train,y_train, BATCH_SIZE) #PrefetchDataset object
    validation_ds =_input_fn(x_val,y_val, BATCH_SIZE) #PrefetchDataset object
    test_ds =_input_fn(x_test,y_test, BATCH_SIZE) #PrefetchDataset object

    train_ds_res = train_ds.map(resize_transform)
    validation_ds_res = validation_ds.map(resize_transform)
    test_ds_res = test_ds.map(resize_transform)

    epoch_steps = int(np.ceil(x_train.shape[0]/BATCH_SIZE))
    val_steps = int(np.ceil(x_val.shape[0]/BATCH_SIZE))
    eval_steps = int(np.ceil(x_test.shape[0]/BATCH_SIZE))

    if model_name == "MobileNet":
       optimized_model_history, fit_time = train_model(optimized_model, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks = [callback], steps_per_epoch = epoch_steps, validation_steps = val_steps)
       _, accuracy = model_report(optimized_model, optimized_model_history, evaluation_dataset = test_ds_res, evaluation_steps = eval_steps)
    else:
       optimized_model_history, fit_time = train_model(optimized_model, train_dataset = train_ds, validation_dataset = validation_ds, epochs = 200, callbacks = [callback], steps_per_epoch = epoch_steps, validation_steps = val_steps)
       _, accuracy = model_report(optimized_model, optimized_model_history, evaluation_dataset = test_ds, evaluation_steps = eval_steps)

    if number_of_classes == 80:
       fit_times[model_name] = fit_time
       
    return accuracy
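The dataset preparation inside fit_and_test_model (reserving the last 15% of the training set for validation, scaling pixel values to [0, 1], and remapping the original CIFAR-100 class indices to the contiguous range 0..N-1) can be reproduced on dummy data. The arrays below are illustrative stand-ins, and the dictionary remap is a plain-NumPy sketch of what create_new_labels is assumed to do:

```python
import numpy as np

# Dummy stand-ins for the notebook's data: 100 "images" with labels
# drawn from a non-contiguous set of original CIFAR-100 class ids.
x_all = np.random.randint(0, 256, size=(100, 32, 32, 3)).astype(np.float32)
our_index = [5, 23, 47, 81]                      # original class ids
y_all = np.random.choice(our_index, size=(100, 1))

# Reserve the last 15% of samples for validation; scale pixels to [0, 1].
val_size = round(0.15 * len(x_all))
x_val, y_val = x_all[-val_size:] / 255, y_all[-val_size:]
x_train, y_train = x_all[:-val_size] / 255, y_all[:-val_size]

# Remap original class ids to contiguous labels 0..N-1, as the final
# Dense layer expects (sketch of the assumed create_new_labels behavior).
remap = {c: i for i, c in enumerate(our_index)}
y_train_new = np.vectorize(remap.get)(y_train)
```

The remapping step matters because the subset's original labels (e.g. 5, 23, 47, 81) would otherwise fall outside the output range of a softmax layer with N units.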

Number of classes = 40

Δίκτυα "from scratch"

In [ ]:
# Number of classes
number_of_classes = 40

accuracies_opt_40 = {}
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_40["SIMPLE_MODEL"] = fit_and_test_model(number_of_classes, SIMPLE_MODEL_OPTIMIZED, "Simple Model")
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization (BatchNo (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu (ReLU)                 (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 32)        0         
_________________________________________________________________
dropout (Dropout)            (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_1 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_1 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_2 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_2 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten (Flatten)            (None, 1024)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 64)                65600     
_________________________________________________________________
dense_1 (Dense)              (None, 40)                2600      
=================================================================
Total params: 125,160
Trainable params: 124,840
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
532/532 [==============================] - 5s 5ms/step - loss: 4.8008 - accuracy: 0.0496 - val_loss: 4.4343 - val_accuracy: 0.0931
Epoch 2/200
532/532 [==============================] - 2s 4ms/step - loss: 4.2571 - accuracy: 0.1248 - val_loss: 3.9665 - val_accuracy: 0.1649
Epoch 3/200
532/532 [==============================] - 2s 4ms/step - loss: 3.8857 - accuracy: 0.1724 - val_loss: 3.6635 - val_accuracy: 0.2064
Epoch 4/200
532/532 [==============================] - 2s 4ms/step - loss: 3.6022 - accuracy: 0.2118 - val_loss: 3.4469 - val_accuracy: 0.2334
Epoch 5/200
532/532 [==============================] - 2s 4ms/step - loss: 3.3563 - accuracy: 0.2559 - val_loss: 3.2127 - val_accuracy: 0.2739
Epoch 6/200
532/532 [==============================] - 2s 4ms/step - loss: 3.1602 - accuracy: 0.2845 - val_loss: 3.0599 - val_accuracy: 0.3005
Epoch 7/200
532/532 [==============================] - 2s 4ms/step - loss: 3.0216 - accuracy: 0.3012 - val_loss: 2.9302 - val_accuracy: 0.3152
Epoch 8/200
532/532 [==============================] - 2s 4ms/step - loss: 2.8784 - accuracy: 0.3280 - val_loss: 2.8415 - val_accuracy: 0.3318
Epoch 9/200
532/532 [==============================] - 2s 4ms/step - loss: 2.7623 - accuracy: 0.3490 - val_loss: 3.0020 - val_accuracy: 0.2902
Epoch 10/200
532/532 [==============================] - 2s 4ms/step - loss: 2.6741 - accuracy: 0.3597 - val_loss: 2.8380 - val_accuracy: 0.3198
Epoch 11/200
532/532 [==============================] - 2s 4ms/step - loss: 2.5877 - accuracy: 0.3740 - val_loss: 2.7366 - val_accuracy: 0.3421
Epoch 12/200
532/532 [==============================] - 2s 4ms/step - loss: 2.5097 - accuracy: 0.3893 - val_loss: 2.6053 - val_accuracy: 0.3660
Epoch 13/200
532/532 [==============================] - 2s 4ms/step - loss: 2.4113 - accuracy: 0.4074 - val_loss: 2.8101 - val_accuracy: 0.3168
Epoch 14/200
532/532 [==============================] - 2s 4ms/step - loss: 2.3674 - accuracy: 0.4094 - val_loss: 2.5587 - val_accuracy: 0.3813
Epoch 15/200
532/532 [==============================] - 2s 4ms/step - loss: 2.3072 - accuracy: 0.4203 - val_loss: 2.4999 - val_accuracy: 0.3797
Epoch 16/200
532/532 [==============================] - 2s 4ms/step - loss: 2.2531 - accuracy: 0.4306 - val_loss: 2.5169 - val_accuracy: 0.3787
Epoch 17/200
532/532 [==============================] - 2s 4ms/step - loss: 2.2291 - accuracy: 0.4351 - val_loss: 2.5227 - val_accuracy: 0.3743
Epoch 18/200
532/532 [==============================] - 2s 4ms/step - loss: 2.1641 - accuracy: 0.4496 - val_loss: 2.3395 - val_accuracy: 0.4146
Epoch 19/200
532/532 [==============================] - 2s 4ms/step - loss: 2.1118 - accuracy: 0.4609 - val_loss: 2.2917 - val_accuracy: 0.4225
Epoch 20/200
532/532 [==============================] - 2s 4ms/step - loss: 2.0765 - accuracy: 0.4672 - val_loss: 2.2211 - val_accuracy: 0.4285
Epoch 21/200
532/532 [==============================] - 2s 4ms/step - loss: 2.0484 - accuracy: 0.4643 - val_loss: 2.1924 - val_accuracy: 0.4425
Epoch 22/200
532/532 [==============================] - 2s 4ms/step - loss: 2.0074 - accuracy: 0.4813 - val_loss: 2.2271 - val_accuracy: 0.4292
Epoch 23/200
532/532 [==============================] - 2s 4ms/step - loss: 1.9796 - accuracy: 0.4847 - val_loss: 2.2591 - val_accuracy: 0.4232
Epoch 24/200
532/532 [==============================] - 2s 4ms/step - loss: 1.9280 - accuracy: 0.4959 - val_loss: 2.3255 - val_accuracy: 0.4166
Epoch 25/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8929 - accuracy: 0.5045 - val_loss: 2.1041 - val_accuracy: 0.4598
Epoch 26/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8690 - accuracy: 0.5092 - val_loss: 2.0978 - val_accuracy: 0.4531
Epoch 27/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8416 - accuracy: 0.5119 - val_loss: 2.3609 - val_accuracy: 0.4029
Epoch 28/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8329 - accuracy: 0.5148 - val_loss: 2.0559 - val_accuracy: 0.4721
Epoch 29/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8024 - accuracy: 0.5227 - val_loss: 2.0383 - val_accuracy: 0.4727
Epoch 30/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7704 - accuracy: 0.5267 - val_loss: 2.1077 - val_accuracy: 0.4448
Epoch 31/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7733 - accuracy: 0.5258 - val_loss: 1.9027 - val_accuracy: 0.5096
Epoch 32/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7492 - accuracy: 0.5366 - val_loss: 1.8914 - val_accuracy: 0.5160
Epoch 33/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7102 - accuracy: 0.5440 - val_loss: 2.0439 - val_accuracy: 0.4714
Epoch 34/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6914 - accuracy: 0.5433 - val_loss: 1.8678 - val_accuracy: 0.5123
Epoch 35/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6719 - accuracy: 0.5481 - val_loss: 1.8623 - val_accuracy: 0.5163
Epoch 36/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6631 - accuracy: 0.5468 - val_loss: 1.8649 - val_accuracy: 0.5203
Epoch 37/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6542 - accuracy: 0.5559 - val_loss: 1.9233 - val_accuracy: 0.4970
Epoch 38/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6108 - accuracy: 0.5647 - val_loss: 2.0398 - val_accuracy: 0.4721
Epoch 39/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6395 - accuracy: 0.5522 - val_loss: 1.9089 - val_accuracy: 0.5063
Epoch 40/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5767 - accuracy: 0.5761 - val_loss: 1.8593 - val_accuracy: 0.5153
Epoch 41/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5924 - accuracy: 0.5666 - val_loss: 2.0573 - val_accuracy: 0.4774
Epoch 42/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5635 - accuracy: 0.5761 - val_loss: 1.8758 - val_accuracy: 0.5203
Epoch 43/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5385 - accuracy: 0.5809 - val_loss: 1.9978 - val_accuracy: 0.4894
Epoch 44/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5539 - accuracy: 0.5753 - val_loss: 1.8005 - val_accuracy: 0.5342
Epoch 45/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5361 - accuracy: 0.5755 - val_loss: 1.8243 - val_accuracy: 0.5233
Epoch 46/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5117 - accuracy: 0.5834 - val_loss: 1.8382 - val_accuracy: 0.5223
Epoch 47/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5035 - accuracy: 0.5928 - val_loss: 1.7687 - val_accuracy: 0.5422
Epoch 48/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4934 - accuracy: 0.5905 - val_loss: 1.7955 - val_accuracy: 0.5339
Epoch 49/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4931 - accuracy: 0.5848 - val_loss: 1.8555 - val_accuracy: 0.5173
Epoch 50/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4417 - accuracy: 0.6050 - val_loss: 1.7974 - val_accuracy: 0.5399
Epoch 51/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4536 - accuracy: 0.6016 - val_loss: 1.9648 - val_accuracy: 0.4904
Epoch 52/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4607 - accuracy: 0.6033 - val_loss: 1.8165 - val_accuracy: 0.5312
Epoch 53/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4466 - accuracy: 0.6030 - val_loss: 1.9005 - val_accuracy: 0.5100
Epoch 54/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4291 - accuracy: 0.6068 - val_loss: 1.9538 - val_accuracy: 0.4967
Epoch 55/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4174 - accuracy: 0.6083 - val_loss: 1.7840 - val_accuracy: 0.5419
Epoch 56/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4100 - accuracy: 0.6133 - val_loss: 1.7370 - val_accuracy: 0.5499
Epoch 57/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3883 - accuracy: 0.6201 - val_loss: 1.7731 - val_accuracy: 0.5422
Epoch 58/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4090 - accuracy: 0.6148 - val_loss: 1.7546 - val_accuracy: 0.5482
Epoch 59/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4090 - accuracy: 0.6088 - val_loss: 2.0139 - val_accuracy: 0.5043
Epoch 60/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3863 - accuracy: 0.6130 - val_loss: 1.7360 - val_accuracy: 0.5485
Epoch 61/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3791 - accuracy: 0.6167 - val_loss: 1.7167 - val_accuracy: 0.5482
Epoch 62/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3599 - accuracy: 0.6200 - val_loss: 1.9190 - val_accuracy: 0.5189
Epoch 63/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3489 - accuracy: 0.6249 - val_loss: 1.7646 - val_accuracy: 0.5485
Epoch 64/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3489 - accuracy: 0.6288 - val_loss: 1.7827 - val_accuracy: 0.5439
Epoch 65/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3413 - accuracy: 0.6269 - val_loss: 1.7206 - val_accuracy: 0.5519
Epoch 66/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3354 - accuracy: 0.6270 - val_loss: 1.7848 - val_accuracy: 0.5376
Epoch 67/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3325 - accuracy: 0.6359 - val_loss: 1.7106 - val_accuracy: 0.5595
Epoch 68/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3364 - accuracy: 0.6360 - val_loss: 1.7014 - val_accuracy: 0.5618
Epoch 69/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3154 - accuracy: 0.6360 - val_loss: 1.8125 - val_accuracy: 0.5352
Epoch 70/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3074 - accuracy: 0.6377 - val_loss: 1.7902 - val_accuracy: 0.5432
Epoch 71/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3029 - accuracy: 0.6356 - val_loss: 1.7600 - val_accuracy: 0.5585
Epoch 72/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3038 - accuracy: 0.6402 - val_loss: 1.6739 - val_accuracy: 0.5678
Epoch 73/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2857 - accuracy: 0.6410 - val_loss: 1.7186 - val_accuracy: 0.5519
Epoch 74/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2985 - accuracy: 0.6351 - val_loss: 1.7314 - val_accuracy: 0.5465
Epoch 75/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2900 - accuracy: 0.6416 - val_loss: 1.7341 - val_accuracy: 0.5535
Epoch 76/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2823 - accuracy: 0.6435 - val_loss: 1.7551 - val_accuracy: 0.5489
Epoch 77/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2951 - accuracy: 0.6363 - val_loss: 1.7078 - val_accuracy: 0.5565
Epoch 78/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2649 - accuracy: 0.6482 - val_loss: 1.7841 - val_accuracy: 0.5459
Epoch 79/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2708 - accuracy: 0.6460 - val_loss: 1.8262 - val_accuracy: 0.5386
Epoch 80/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2792 - accuracy: 0.6432 - val_loss: 1.7250 - val_accuracy: 0.5588
Epoch 81/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2535 - accuracy: 0.6540 - val_loss: 1.6959 - val_accuracy: 0.5665
Epoch 82/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2391 - accuracy: 0.6504 - val_loss: 1.7744 - val_accuracy: 0.5436
Epoch 83/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2512 - accuracy: 0.6540 - val_loss: 1.7641 - val_accuracy: 0.5519
Epoch 84/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2265 - accuracy: 0.6575 - val_loss: 1.7551 - val_accuracy: 0.5505
Epoch 85/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2481 - accuracy: 0.6512 - val_loss: 1.7348 - val_accuracy: 0.5642
Epoch 86/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2441 - accuracy: 0.6514 - val_loss: 1.7475 - val_accuracy: 0.5628
Epoch 87/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2172 - accuracy: 0.6586 - val_loss: 1.7540 - val_accuracy: 0.5515
Epoch 88/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2248 - accuracy: 0.6563 - val_loss: 1.7688 - val_accuracy: 0.5568
Epoch 89/200
532/532 [==============================] - 2s 5ms/step - loss: 1.2103 - accuracy: 0.6584 - val_loss: 1.7566 - val_accuracy: 0.5612
Epoch 90/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1854 - accuracy: 0.6734 - val_loss: 1.8333 - val_accuracy: 0.5419
Epoch 91/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1881 - accuracy: 0.6645 - val_loss: 1.7095 - val_accuracy: 0.5648
Epoch 92/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2032 - accuracy: 0.6700 - val_loss: 1.7462 - val_accuracy: 0.5642
Test set evaluation metrics
---------------------------
Loss:     1.651
Accuracy: 56.250%
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_40["CNN1"] = fit_and_test_model(number_of_classes, CNN1_MODEL_OPTIMIZED, "Cnn1")
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_4 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dropout_6 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 40)                41000     
=================================================================
Total params: 660,456
Trainable params: 660,008
Non-trainable params: 448
_________________________________________________________________
Epoch 1/200
532/532 [==============================] - 3s 5ms/step - loss: 4.7322 - accuracy: 0.0917 - val_loss: 4.5110 - val_accuracy: 0.1140
Epoch 2/200
532/532 [==============================] - 2s 4ms/step - loss: 3.9170 - accuracy: 0.2093 - val_loss: 3.7245 - val_accuracy: 0.2224
Epoch 3/200
532/532 [==============================] - 2s 4ms/step - loss: 3.5234 - accuracy: 0.2630 - val_loss: 3.4237 - val_accuracy: 0.2666
Epoch 4/200
532/532 [==============================] - 2s 4ms/step - loss: 3.2249 - accuracy: 0.3000 - val_loss: 3.2007 - val_accuracy: 0.2859
Epoch 5/200
532/532 [==============================] - 2s 4ms/step - loss: 2.9960 - accuracy: 0.3315 - val_loss: 3.3682 - val_accuracy: 0.2457
Epoch 6/200
532/532 [==============================] - 2s 4ms/step - loss: 2.8343 - accuracy: 0.3472 - val_loss: 3.0018 - val_accuracy: 0.3135
Epoch 7/200
532/532 [==============================] - 2s 4ms/step - loss: 2.6452 - accuracy: 0.3924 - val_loss: 2.9027 - val_accuracy: 0.3288
Epoch 8/200
532/532 [==============================] - 2s 4ms/step - loss: 2.5677 - accuracy: 0.3916 - val_loss: 2.8157 - val_accuracy: 0.3428
Epoch 9/200
532/532 [==============================] - 2s 4ms/step - loss: 2.4559 - accuracy: 0.4130 - val_loss: 2.7061 - val_accuracy: 0.3511
Epoch 10/200
532/532 [==============================] - 2s 4ms/step - loss: 2.3758 - accuracy: 0.4218 - val_loss: 2.4176 - val_accuracy: 0.4116
Epoch 11/200
532/532 [==============================] - 2s 4ms/step - loss: 2.2843 - accuracy: 0.4448 - val_loss: 2.4649 - val_accuracy: 0.4009
Epoch 12/200
532/532 [==============================] - 2s 4ms/step - loss: 2.2202 - accuracy: 0.4553 - val_loss: 2.4164 - val_accuracy: 0.4212
Epoch 13/200
532/532 [==============================] - 2s 4ms/step - loss: 2.1375 - accuracy: 0.4645 - val_loss: 2.2704 - val_accuracy: 0.4395
Epoch 14/200
532/532 [==============================] - 2s 4ms/step - loss: 2.1007 - accuracy: 0.4747 - val_loss: 2.2221 - val_accuracy: 0.4481
Epoch 15/200
532/532 [==============================] - 2s 4ms/step - loss: 2.0395 - accuracy: 0.4837 - val_loss: 2.2038 - val_accuracy: 0.4488
Epoch 16/200
532/532 [==============================] - 2s 4ms/step - loss: 2.0037 - accuracy: 0.4885 - val_loss: 2.1236 - val_accuracy: 0.4668
Epoch 17/200
532/532 [==============================] - 2s 4ms/step - loss: 1.9530 - accuracy: 0.4966 - val_loss: 2.1134 - val_accuracy: 0.4664
Epoch 18/200
532/532 [==============================] - 2s 4ms/step - loss: 1.9286 - accuracy: 0.4972 - val_loss: 2.2814 - val_accuracy: 0.4312
Epoch 19/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8686 - accuracy: 0.5139 - val_loss: 2.3210 - val_accuracy: 0.4255
Epoch 20/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8496 - accuracy: 0.5223 - val_loss: 2.1259 - val_accuracy: 0.4668
Epoch 21/200
532/532 [==============================] - 2s 4ms/step - loss: 1.8125 - accuracy: 0.5276 - val_loss: 2.0210 - val_accuracy: 0.4957
Epoch 22/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7917 - accuracy: 0.5223 - val_loss: 1.9614 - val_accuracy: 0.5023
Epoch 23/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7331 - accuracy: 0.5437 - val_loss: 2.0044 - val_accuracy: 0.4917
Epoch 24/200
532/532 [==============================] - 2s 4ms/step - loss: 1.7115 - accuracy: 0.5473 - val_loss: 1.9358 - val_accuracy: 0.5093
Epoch 25/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6999 - accuracy: 0.5477 - val_loss: 1.9429 - val_accuracy: 0.4983
Epoch 26/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6564 - accuracy: 0.5560 - val_loss: 1.8808 - val_accuracy: 0.5219
Epoch 27/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6625 - accuracy: 0.5610 - val_loss: 1.7771 - val_accuracy: 0.5362
Epoch 28/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6273 - accuracy: 0.5645 - val_loss: 2.0147 - val_accuracy: 0.4877
Epoch 29/200
532/532 [==============================] - 2s 4ms/step - loss: 1.6126 - accuracy: 0.5712 - val_loss: 1.7517 - val_accuracy: 0.5376
Epoch 30/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5636 - accuracy: 0.5840 - val_loss: 1.9437 - val_accuracy: 0.4997
Epoch 31/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5505 - accuracy: 0.5793 - val_loss: 1.7874 - val_accuracy: 0.5339
Epoch 32/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5430 - accuracy: 0.5863 - val_loss: 1.7681 - val_accuracy: 0.5399
Epoch 33/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5308 - accuracy: 0.5829 - val_loss: 1.8882 - val_accuracy: 0.5193
Epoch 34/200
532/532 [==============================] - 2s 4ms/step - loss: 1.5070 - accuracy: 0.5864 - val_loss: 1.8366 - val_accuracy: 0.5259
Epoch 35/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4836 - accuracy: 0.5955 - val_loss: 1.7387 - val_accuracy: 0.5512
Epoch 36/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4867 - accuracy: 0.5948 - val_loss: 1.8357 - val_accuracy: 0.5226
Epoch 37/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4712 - accuracy: 0.5986 - val_loss: 1.7013 - val_accuracy: 0.5608
Epoch 38/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4454 - accuracy: 0.6007 - val_loss: 1.7735 - val_accuracy: 0.5459
Epoch 39/200
532/532 [==============================] - 2s 4ms/step - loss: 1.4157 - accuracy: 0.6157 - val_loss: 1.6828 - val_accuracy: 0.5682
Epoch 40/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3996 - accuracy: 0.6244 - val_loss: 1.7536 - val_accuracy: 0.5329
Epoch 41/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3988 - accuracy: 0.6172 - val_loss: 1.7956 - val_accuracy: 0.5349
Epoch 42/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3837 - accuracy: 0.6204 - val_loss: 1.7063 - val_accuracy: 0.5542
Epoch 43/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3686 - accuracy: 0.6286 - val_loss: 1.6850 - val_accuracy: 0.5645
Epoch 44/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3744 - accuracy: 0.6201 - val_loss: 1.8209 - val_accuracy: 0.5339
Epoch 45/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3559 - accuracy: 0.6322 - val_loss: 1.7617 - val_accuracy: 0.5426
Epoch 46/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3322 - accuracy: 0.6316 - val_loss: 1.8292 - val_accuracy: 0.5412
Epoch 47/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3239 - accuracy: 0.6364 - val_loss: 1.6463 - val_accuracy: 0.5635
Epoch 48/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2786 - accuracy: 0.6499 - val_loss: 1.7055 - val_accuracy: 0.5412
Epoch 49/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3132 - accuracy: 0.6363 - val_loss: 1.6956 - val_accuracy: 0.5585
Epoch 50/200
532/532 [==============================] - 2s 4ms/step - loss: 1.3063 - accuracy: 0.6410 - val_loss: 1.6944 - val_accuracy: 0.5539
Epoch 51/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2754 - accuracy: 0.6534 - val_loss: 1.6869 - val_accuracy: 0.5665
Epoch 52/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2603 - accuracy: 0.6561 - val_loss: 1.9593 - val_accuracy: 0.5120
Epoch 53/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2659 - accuracy: 0.6519 - val_loss: 1.7068 - val_accuracy: 0.5495
Epoch 54/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2673 - accuracy: 0.6537 - val_loss: 1.6336 - val_accuracy: 0.5645
Epoch 55/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2515 - accuracy: 0.6556 - val_loss: 1.6162 - val_accuracy: 0.5735
Epoch 56/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2178 - accuracy: 0.6653 - val_loss: 1.5866 - val_accuracy: 0.5821
Epoch 57/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2422 - accuracy: 0.6546 - val_loss: 1.6531 - val_accuracy: 0.5618
Epoch 58/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2018 - accuracy: 0.6699 - val_loss: 1.6495 - val_accuracy: 0.5698
Epoch 59/200
532/532 [==============================] - 2s 4ms/step - loss: 1.2229 - accuracy: 0.6637 - val_loss: 1.6636 - val_accuracy: 0.5731
Epoch 60/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1857 - accuracy: 0.6687 - val_loss: 1.6222 - val_accuracy: 0.5688
Epoch 61/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1954 - accuracy: 0.6686 - val_loss: 1.5765 - val_accuracy: 0.5854
Epoch 62/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1688 - accuracy: 0.6730 - val_loss: 1.8280 - val_accuracy: 0.5296
Epoch 63/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1763 - accuracy: 0.6746 - val_loss: 1.6655 - val_accuracy: 0.5588
Epoch 64/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1539 - accuracy: 0.6738 - val_loss: 1.6425 - val_accuracy: 0.5718
Epoch 65/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1494 - accuracy: 0.6775 - val_loss: 1.6810 - val_accuracy: 0.5595
Epoch 66/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1350 - accuracy: 0.6863 - val_loss: 1.6867 - val_accuracy: 0.5685
Epoch 67/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1301 - accuracy: 0.6861 - val_loss: 1.8402 - val_accuracy: 0.5339
Epoch 68/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1144 - accuracy: 0.6908 - val_loss: 1.7142 - val_accuracy: 0.5632
Epoch 69/200
532/532 [==============================] - 2s 4ms/step - loss: 1.1126 - accuracy: 0.6917 - val_loss: 1.6939 - val_accuracy: 0.5618
Epoch 70/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0982 - accuracy: 0.6974 - val_loss: 1.6070 - val_accuracy: 0.5788
Epoch 71/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0872 - accuracy: 0.7028 - val_loss: 1.6480 - val_accuracy: 0.5622
Epoch 72/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0995 - accuracy: 0.7000 - val_loss: 1.5088 - val_accuracy: 0.6024
Epoch 73/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0721 - accuracy: 0.7052 - val_loss: 1.6677 - val_accuracy: 0.5824
Epoch 74/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0732 - accuracy: 0.7112 - val_loss: 1.5695 - val_accuracy: 0.5931
Epoch 75/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0878 - accuracy: 0.7014 - val_loss: 1.5871 - val_accuracy: 0.5795
Epoch 76/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0650 - accuracy: 0.7094 - val_loss: 1.5969 - val_accuracy: 0.5888
Epoch 77/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0659 - accuracy: 0.6989 - val_loss: 1.5919 - val_accuracy: 0.5798
Epoch 78/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0556 - accuracy: 0.7070 - val_loss: 1.5602 - val_accuracy: 0.5921
Epoch 79/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0410 - accuracy: 0.7152 - val_loss: 1.6673 - val_accuracy: 0.5828
Epoch 80/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0495 - accuracy: 0.7060 - val_loss: 1.5288 - val_accuracy: 0.5944
Epoch 81/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0581 - accuracy: 0.7028 - val_loss: 1.6129 - val_accuracy: 0.5824
Epoch 82/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0415 - accuracy: 0.7138 - val_loss: 1.5585 - val_accuracy: 0.5908
Epoch 83/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0198 - accuracy: 0.7201 - val_loss: 1.6096 - val_accuracy: 0.5861
Epoch 84/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0125 - accuracy: 0.7202 - val_loss: 1.5372 - val_accuracy: 0.6084
Epoch 85/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0048 - accuracy: 0.7259 - val_loss: 1.5505 - val_accuracy: 0.5991
Epoch 86/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0103 - accuracy: 0.7190 - val_loss: 1.5890 - val_accuracy: 0.5928
Epoch 87/200
532/532 [==============================] - 2s 4ms/step - loss: 1.0228 - accuracy: 0.7165 - val_loss: 1.5826 - val_accuracy: 0.5928
Epoch 88/200
532/532 [==============================] - 2s 4ms/step - loss: 0.9924 - accuracy: 0.7251 - val_loss: 1.7017 - val_accuracy: 0.5735
Epoch 89/200
532/532 [==============================] - 2s 4ms/step - loss: 0.9848 - accuracy: 0.7290 - val_loss: 1.6933 - val_accuracy: 0.5638
Epoch 90/200
532/532 [==============================] - 2s 4ms/step - loss: 0.9827 - accuracy: 0.7328 - val_loss: 1.5190 - val_accuracy: 0.6137
Epoch 91/200
532/532 [==============================] - 2s 4ms/step - loss: 0.9845 - accuracy: 0.7283 - val_loss: 1.7097 - val_accuracy: 0.5715
Epoch 92/200
532/532 [==============================] - 2s 4ms/step - loss: 0.9817 - accuracy: 0.7255 - val_loss: 1.7212 - val_accuracy: 0.5731
Test set evaluation metrics
---------------------------
Loss:     1.463
Accuracy: 61.775%
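Training above halts at epoch 92 of the scheduled 200, which is consistent with an early-stopping callback monitoring `val_loss`; the exact callback configuration is not shown in this excerpt, so the patience value below is hypothetical. A minimal pure-Python sketch of how patience-based early stopping decides when to halt:

```python
def early_stop_epoch(val_losses, patience=20):
    """Return the 1-based epoch at which patience-based early stopping
    would halt: training stops once val_loss has not improved for
    `patience` consecutive epochs. Illustrative only; the notebook's
    actual EarlyStopping settings are not shown in this excerpt."""
    best = float("inf")
    since_best = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss
            since_best = 0
        else:
            since_best += 1
            if since_best >= patience:
                return epoch
    return len(val_losses)  # ran all epochs without triggering

# Toy trace: val_loss stops improving after epoch 3
print(early_stop_epoch([2.0, 1.5, 1.2, 1.3, 1.25, 1.4], patience=3))  # -> 6
```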
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_40["CNN2"] = fit_and_test_model(number_of_classes, CNN2_MODEL_OPTIMIZED, "Cnn2")
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_7 (Dropout)          (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_8 (Dropout)          (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_9 (Dropout)          (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_9 (Batch (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_9 (ReLU)               (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_10 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dropout_11 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 40)                20520     
=================================================================
Total params: 2,508,520
Trainable params: 2,507,560
Non-trainable params: 960
_________________________________________________________________
Epoch 1/200
532/532 [==============================] - 4s 6ms/step - loss: 6.5268 - accuracy: 0.0826 - val_loss: 6.1391 - val_accuracy: 0.0918
Epoch 2/200
532/532 [==============================] - 3s 5ms/step - loss: 5.4798 - accuracy: 0.1873 - val_loss: 5.3620 - val_accuracy: 0.1506
Epoch 3/200
532/532 [==============================] - 3s 5ms/step - loss: 4.8309 - accuracy: 0.2412 - val_loss: 4.7779 - val_accuracy: 0.2068
Epoch 4/200
532/532 [==============================] - 3s 5ms/step - loss: 4.3177 - accuracy: 0.2855 - val_loss: 4.9151 - val_accuracy: 0.1543
Epoch 5/200
532/532 [==============================] - 3s 5ms/step - loss: 3.8947 - accuracy: 0.3262 - val_loss: 4.4009 - val_accuracy: 0.2071
Epoch 6/200
532/532 [==============================] - 3s 5ms/step - loss: 3.5451 - accuracy: 0.3550 - val_loss: 3.9945 - val_accuracy: 0.2540
Epoch 7/200
532/532 [==============================] - 3s 5ms/step - loss: 3.2478 - accuracy: 0.3914 - val_loss: 3.5724 - val_accuracy: 0.3088
Epoch 8/200
532/532 [==============================] - 3s 5ms/step - loss: 3.0349 - accuracy: 0.4112 - val_loss: 3.5371 - val_accuracy: 0.2939
Epoch 9/200
532/532 [==============================] - 3s 5ms/step - loss: 2.8411 - accuracy: 0.4220 - val_loss: 3.3195 - val_accuracy: 0.3271
Epoch 10/200
532/532 [==============================] - 3s 5ms/step - loss: 2.6524 - accuracy: 0.4493 - val_loss: 3.1553 - val_accuracy: 0.3444
Epoch 11/200
532/532 [==============================] - 3s 5ms/step - loss: 2.5093 - accuracy: 0.4711 - val_loss: 2.6862 - val_accuracy: 0.4162
Epoch 12/200
532/532 [==============================] - 3s 5ms/step - loss: 2.3716 - accuracy: 0.4834 - val_loss: 2.7039 - val_accuracy: 0.4092
Epoch 13/200
532/532 [==============================] - 3s 5ms/step - loss: 2.2801 - accuracy: 0.4906 - val_loss: 2.8445 - val_accuracy: 0.3810
Epoch 14/200
532/532 [==============================] - 3s 5ms/step - loss: 2.1718 - accuracy: 0.5127 - val_loss: 2.4641 - val_accuracy: 0.4468
Epoch 15/200
532/532 [==============================] - 3s 5ms/step - loss: 2.0800 - accuracy: 0.5279 - val_loss: 2.3969 - val_accuracy: 0.4545
Epoch 16/200
532/532 [==============================] - 3s 5ms/step - loss: 1.9895 - accuracy: 0.5404 - val_loss: 2.2240 - val_accuracy: 0.4924
Epoch 17/200
532/532 [==============================] - 3s 5ms/step - loss: 1.9340 - accuracy: 0.5530 - val_loss: 2.2568 - val_accuracy: 0.4754
Epoch 18/200
532/532 [==============================] - 3s 5ms/step - loss: 1.8506 - accuracy: 0.5661 - val_loss: 2.1100 - val_accuracy: 0.5186
Epoch 19/200
532/532 [==============================] - 3s 5ms/step - loss: 1.7959 - accuracy: 0.5747 - val_loss: 2.4344 - val_accuracy: 0.4571
Epoch 20/200
532/532 [==============================] - 3s 5ms/step - loss: 1.7639 - accuracy: 0.5777 - val_loss: 2.0545 - val_accuracy: 0.5140
Epoch 21/200
532/532 [==============================] - 3s 5ms/step - loss: 1.7031 - accuracy: 0.5928 - val_loss: 1.9858 - val_accuracy: 0.5299
Epoch 22/200
532/532 [==============================] - 3s 5ms/step - loss: 1.6599 - accuracy: 0.5977 - val_loss: 1.9386 - val_accuracy: 0.5445
Epoch 23/200
532/532 [==============================] - 3s 5ms/step - loss: 1.6050 - accuracy: 0.6114 - val_loss: 1.9609 - val_accuracy: 0.5256
Epoch 24/200
532/532 [==============================] - 3s 5ms/step - loss: 1.5615 - accuracy: 0.6208 - val_loss: 2.0516 - val_accuracy: 0.5163
Epoch 25/200
532/532 [==============================] - 3s 5ms/step - loss: 1.5382 - accuracy: 0.6213 - val_loss: 1.9882 - val_accuracy: 0.5253
Epoch 26/200
532/532 [==============================] - 3s 5ms/step - loss: 1.4782 - accuracy: 0.6397 - val_loss: 1.9234 - val_accuracy: 0.5316
Epoch 27/200
532/532 [==============================] - 3s 5ms/step - loss: 1.4820 - accuracy: 0.6345 - val_loss: 1.9463 - val_accuracy: 0.5329
Epoch 28/200
532/532 [==============================] - 3s 5ms/step - loss: 1.4251 - accuracy: 0.6511 - val_loss: 1.9070 - val_accuracy: 0.5495
Epoch 29/200
532/532 [==============================] - 3s 5ms/step - loss: 1.3924 - accuracy: 0.6559 - val_loss: 1.9477 - val_accuracy: 0.5429
Epoch 30/200
532/532 [==============================] - 3s 5ms/step - loss: 1.3509 - accuracy: 0.6667 - val_loss: 1.9409 - val_accuracy: 0.5459
Epoch 31/200
532/532 [==============================] - 3s 5ms/step - loss: 1.3601 - accuracy: 0.6716 - val_loss: 1.8307 - val_accuracy: 0.5532
Epoch 32/200
532/532 [==============================] - 3s 5ms/step - loss: 1.2940 - accuracy: 0.6857 - val_loss: 1.8634 - val_accuracy: 0.5562
Epoch 33/200
532/532 [==============================] - 3s 5ms/step - loss: 1.2699 - accuracy: 0.6881 - val_loss: 1.9848 - val_accuracy: 0.5362
Epoch 34/200
532/532 [==============================] - 3s 5ms/step - loss: 1.2584 - accuracy: 0.6875 - val_loss: 2.0020 - val_accuracy: 0.5339
Epoch 35/200
532/532 [==============================] - 3s 5ms/step - loss: 1.2252 - accuracy: 0.6976 - val_loss: 1.7678 - val_accuracy: 0.5691
Epoch 36/200
532/532 [==============================] - 3s 5ms/step - loss: 1.1979 - accuracy: 0.7052 - val_loss: 1.8799 - val_accuracy: 0.5512
Epoch 37/200
532/532 [==============================] - 3s 5ms/step - loss: 1.1989 - accuracy: 0.7056 - val_loss: 1.9607 - val_accuracy: 0.5416
Epoch 38/200
532/532 [==============================] - 3s 5ms/step - loss: 1.1626 - accuracy: 0.7140 - val_loss: 1.7142 - val_accuracy: 0.5888
Epoch 39/200
532/532 [==============================] - 3s 5ms/step - loss: 1.1552 - accuracy: 0.7153 - val_loss: 1.8252 - val_accuracy: 0.5678
Epoch 40/200
532/532 [==============================] - 3s 5ms/step - loss: 1.1304 - accuracy: 0.7309 - val_loss: 1.8012 - val_accuracy: 0.5751
Epoch 41/200
532/532 [==============================] - 3s 5ms/step - loss: 1.1024 - accuracy: 0.7276 - val_loss: 2.0078 - val_accuracy: 0.5495
Epoch 42/200
532/532 [==============================] - 3s 5ms/step - loss: 1.0814 - accuracy: 0.7367 - val_loss: 1.7435 - val_accuracy: 0.5801
Epoch 43/200
532/532 [==============================] - 3s 5ms/step - loss: 1.0946 - accuracy: 0.7381 - val_loss: 1.7112 - val_accuracy: 0.5898
Epoch 44/200
532/532 [==============================] - 3s 5ms/step - loss: 1.0500 - accuracy: 0.7526 - val_loss: 1.7856 - val_accuracy: 0.5818
Epoch 45/200
532/532 [==============================] - 3s 5ms/step - loss: 1.0393 - accuracy: 0.7518 - val_loss: 1.7459 - val_accuracy: 0.5814
Epoch 46/200
532/532 [==============================] - 3s 5ms/step - loss: 1.0395 - accuracy: 0.7524 - val_loss: 1.7652 - val_accuracy: 0.5834
Epoch 47/200
532/532 [==============================] - 3s 5ms/step - loss: 1.0279 - accuracy: 0.7538 - val_loss: 1.7477 - val_accuracy: 0.5848
Epoch 48/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9912 - accuracy: 0.7639 - val_loss: 1.9691 - val_accuracy: 0.5422
Epoch 49/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9887 - accuracy: 0.7633 - val_loss: 1.9982 - val_accuracy: 0.5439
Epoch 50/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9752 - accuracy: 0.7691 - val_loss: 1.8151 - val_accuracy: 0.5771
Epoch 51/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9621 - accuracy: 0.7726 - val_loss: 1.7171 - val_accuracy: 0.5977
Epoch 52/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9378 - accuracy: 0.7801 - val_loss: 1.7454 - val_accuracy: 0.5991
Epoch 53/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9357 - accuracy: 0.7805 - val_loss: 1.6911 - val_accuracy: 0.5984
Epoch 54/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9215 - accuracy: 0.7863 - val_loss: 1.8015 - val_accuracy: 0.5911
Epoch 55/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8974 - accuracy: 0.7895 - val_loss: 1.7867 - val_accuracy: 0.5834
Epoch 56/200
532/532 [==============================] - 3s 5ms/step - loss: 0.9130 - accuracy: 0.7873 - val_loss: 1.7530 - val_accuracy: 0.5928
Epoch 57/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8882 - accuracy: 0.7978 - val_loss: 1.7726 - val_accuracy: 0.5894
Epoch 58/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8791 - accuracy: 0.7977 - val_loss: 1.7676 - val_accuracy: 0.5834
Epoch 59/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8823 - accuracy: 0.7920 - val_loss: 1.7695 - val_accuracy: 0.5918
Epoch 60/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8468 - accuracy: 0.8129 - val_loss: 1.7362 - val_accuracy: 0.6001
Epoch 61/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8660 - accuracy: 0.8065 - val_loss: 1.8217 - val_accuracy: 0.5904
Epoch 62/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8447 - accuracy: 0.8124 - val_loss: 1.7236 - val_accuracy: 0.5987
Epoch 63/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8357 - accuracy: 0.8116 - val_loss: 1.7624 - val_accuracy: 0.6034
Epoch 64/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8350 - accuracy: 0.8172 - val_loss: 1.8575 - val_accuracy: 0.5844
Epoch 65/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8115 - accuracy: 0.8197 - val_loss: 1.8067 - val_accuracy: 0.5921
Epoch 66/200
532/532 [==============================] - 3s 5ms/step - loss: 0.8452 - accuracy: 0.8061 - val_loss: 1.7753 - val_accuracy: 0.5964
Epoch 67/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7898 - accuracy: 0.8242 - val_loss: 1.7385 - val_accuracy: 0.6084
Epoch 68/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7989 - accuracy: 0.8262 - val_loss: 1.9176 - val_accuracy: 0.5831
Epoch 69/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7923 - accuracy: 0.8251 - val_loss: 1.8463 - val_accuracy: 0.5894
Epoch 70/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7731 - accuracy: 0.8366 - val_loss: 1.8192 - val_accuracy: 0.5984
Epoch 71/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7843 - accuracy: 0.8288 - val_loss: 1.7353 - val_accuracy: 0.6014
Epoch 72/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7648 - accuracy: 0.8351 - val_loss: 1.7938 - val_accuracy: 0.5938
Epoch 73/200
532/532 [==============================] - 3s 5ms/step - loss: 0.7692 - accuracy: 0.8340 - val_loss: 1.7317 - val_accuracy: 0.6080
Test set evaluation metrics
---------------------------
Loss:     1.666
Accuracy: 60.900%
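The Param # column of the CNN2 summary above can be verified by hand: a Conv2D layer with k×k kernels contributes (k²·c_in + 1)·c_out parameters, a Dense layer (n_in + 1)·n_out, and BatchNormalization 4 per channel, of which the moving mean and variance (2 per channel) are non-trainable. A quick arithmetic check, assuming 3×3 kernels (inferred from the parameter counts themselves, as the kernel size is not printed in the summary):

```python
def conv2d_params(k, c_in, c_out):
    # each filter: k*k*c_in weights + 1 bias
    return (k * k * c_in + 1) * c_out

def dense_params(n_in, n_out):
    # weight matrix + one bias per output unit
    return (n_in + 1) * n_out

def batchnorm_params(channels):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * channels

total = (
    conv2d_params(3, 3, 32)           # conv2d_6:  896
    + batchnorm_params(32)            # 128
    + conv2d_params(3, 32, 64)        # conv2d_7:  18496
    + batchnorm_params(64)            # 256
    + conv2d_params(3, 64, 128)       # conv2d_8:  73856
    + batchnorm_params(128)           # 512
    + conv2d_params(3, 128, 256)      # conv2d_9:  295168
    + batchnorm_params(256)           # 1024
    + dense_params(4 * 4 * 256, 512)  # dense_4:   2097664
    + dense_params(512, 40)           # dense_5:   20520
)
non_trainable = 2 * (32 + 64 + 128 + 256)  # BN moving statistics only
print(total)          # -> 2508520
print(non_trainable)  # -> 960
```

Both values match the summary's "Total params: 2,508,520" and "Non-trainable params: 960".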

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_40["VGG_ALL"] = fit_and_test_model(number_of_classes, VGG16_MODEL_OPTIMIZED, "VGG16")
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout (Dropout)            (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 512)               0         
_________________________________________________________________
dense (Dense)                (None, 40)                20520     
=================================================================
Total params: 14,735,208
Trainable params: 14,735,208
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
532/532 [==============================] - 12s 19ms/step - loss: 3.6165 - accuracy: 0.0705 - val_loss: 2.7216 - val_accuracy: 0.2842
Epoch 2/200
532/532 [==============================] - 10s 19ms/step - loss: 2.5322 - accuracy: 0.3148 - val_loss: 1.8293 - val_accuracy: 0.4634
Epoch 3/200
532/532 [==============================] - 10s 19ms/step - loss: 1.7336 - accuracy: 0.4988 - val_loss: 1.4651 - val_accuracy: 0.5682
Epoch 4/200
532/532 [==============================] - 10s 18ms/step - loss: 1.3179 - accuracy: 0.6013 - val_loss: 1.3332 - val_accuracy: 0.6074
Epoch 5/200
532/532 [==============================] - 10s 19ms/step - loss: 1.0271 - accuracy: 0.6851 - val_loss: 1.2533 - val_accuracy: 0.6333
Epoch 6/200
532/532 [==============================] - 10s 18ms/step - loss: 0.8357 - accuracy: 0.7449 - val_loss: 1.2641 - val_accuracy: 0.6483
Epoch 7/200
532/532 [==============================] - 10s 19ms/step - loss: 0.6628 - accuracy: 0.7946 - val_loss: 1.2678 - val_accuracy: 0.6592
Epoch 8/200
532/532 [==============================] - 10s 19ms/step - loss: 0.4843 - accuracy: 0.8482 - val_loss: 1.3266 - val_accuracy: 0.6596
Epoch 9/200
532/532 [==============================] - 10s 18ms/step - loss: 0.3774 - accuracy: 0.8805 - val_loss: 1.3277 - val_accuracy: 0.6676
Epoch 10/200
532/532 [==============================] - 10s 18ms/step - loss: 0.2773 - accuracy: 0.9120 - val_loss: 1.3887 - val_accuracy: 0.6649
Epoch 11/200
532/532 [==============================] - 10s 18ms/step - loss: 0.2112 - accuracy: 0.9315 - val_loss: 1.6171 - val_accuracy: 0.6503
Epoch 12/200
532/532 [==============================] - 10s 19ms/step - loss: 0.1834 - accuracy: 0.9412 - val_loss: 1.5746 - val_accuracy: 0.6602
Epoch 13/200
532/532 [==============================] - 10s 18ms/step - loss: 0.1251 - accuracy: 0.9594 - val_loss: 1.5805 - val_accuracy: 0.6619
Epoch 14/200
532/532 [==============================] - 10s 19ms/step - loss: 0.1200 - accuracy: 0.9647 - val_loss: 1.7161 - val_accuracy: 0.6539
Epoch 15/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0933 - accuracy: 0.9713 - val_loss: 1.7046 - val_accuracy: 0.6612
Epoch 16/200
532/532 [==============================] - 10s 19ms/step - loss: 0.0808 - accuracy: 0.9751 - val_loss: 1.8525 - val_accuracy: 0.6373
Epoch 17/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0991 - accuracy: 0.9706 - val_loss: 1.8290 - val_accuracy: 0.6546
Epoch 18/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0523 - accuracy: 0.9831 - val_loss: 1.9226 - val_accuracy: 0.6516
Epoch 19/200
532/532 [==============================] - 10s 19ms/step - loss: 0.0815 - accuracy: 0.9737 - val_loss: 1.7613 - val_accuracy: 0.6533
Epoch 20/200
532/532 [==============================] - 10s 19ms/step - loss: 0.0631 - accuracy: 0.9803 - val_loss: 1.9627 - val_accuracy: 0.6582
Epoch 21/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0678 - accuracy: 0.9800 - val_loss: 1.8687 - val_accuracy: 0.6489
Epoch 22/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0638 - accuracy: 0.9824 - val_loss: 1.7940 - val_accuracy: 0.6652
Epoch 23/200
532/532 [==============================] - 10s 19ms/step - loss: 0.0552 - accuracy: 0.9840 - val_loss: 1.9269 - val_accuracy: 0.6536
Epoch 24/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0568 - accuracy: 0.9823 - val_loss: 1.9381 - val_accuracy: 0.6479
Epoch 25/200
532/532 [==============================] - 10s 18ms/step - loss: 0.0444 - accuracy: 0.9851 - val_loss: 2.0011 - val_accuracy: 0.6652
Test set evaluation metrics
---------------------------
Loss:     1.210
Accuracy: 63.825%
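In the transfer-learning models only the classification head differs between backbones: after global average pooling, a Dense layer over the pooled features adds (features + 1)·classes parameters. A small check against the head sizes reported in the summaries (512 features for VGG16, 1280 for MobileNetV2, 1024 for DenseNet121, 40 classes):

```python
def head_params(features, classes):
    # Dense layer on the globally pooled feature vector: weights + biases
    return (features + 1) * classes

print(head_params(512, 40))   # VGG16 head       -> 20520
print(head_params(1280, 40))  # MobileNetV2 head -> 51240
print(head_params(1024, 40))  # DenseNet121 head -> 41000
```

Note also that "Trainable params: 14,735,208" for VGG16 equals the full model size, i.e. the convolutional base is fine-tuned end to end rather than frozen.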
MobileNet
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_40["MOBILENET_ALL"] = fit_and_test_model(number_of_classes, MobileNetV2_MODEL_OPTIMIZED, "MobileNet")
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9412608/9406464 [==============================] - 0s 0us/step
Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_13 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1280)              0         
_________________________________________________________________
dense_7 (Dense)              (None, 40)                51240     
=================================================================
Total params: 2,309,224
Trainable params: 2,275,112
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
532/532 [==============================] - 123s 223ms/step - loss: 2.1427 - accuracy: 0.4335 - val_loss: 4.0282 - val_accuracy: 0.1835
Epoch 2/200
532/532 [==============================] - 117s 220ms/step - loss: 0.5977 - accuracy: 0.8143 - val_loss: 3.4124 - val_accuracy: 0.1971
Epoch 3/200
532/532 [==============================] - 114s 215ms/step - loss: 0.3068 - accuracy: 0.9092 - val_loss: 3.5062 - val_accuracy: 0.2028
Epoch 4/200
532/532 [==============================] - 117s 221ms/step - loss: 0.1708 - accuracy: 0.9513 - val_loss: 3.2548 - val_accuracy: 0.2311
Epoch 5/200
532/532 [==============================] - 117s 220ms/step - loss: 0.0984 - accuracy: 0.9745 - val_loss: 1.8619 - val_accuracy: 0.5123
Epoch 6/200
532/532 [==============================] - 118s 222ms/step - loss: 0.0704 - accuracy: 0.9815 - val_loss: 1.3286 - val_accuracy: 0.6546
Epoch 7/200
532/532 [==============================] - 118s 221ms/step - loss: 0.0525 - accuracy: 0.9868 - val_loss: 0.8764 - val_accuracy: 0.7630
Epoch 8/200
532/532 [==============================] - 117s 219ms/step - loss: 0.0441 - accuracy: 0.9898 - val_loss: 1.1689 - val_accuracy: 0.7224
Epoch 9/200
532/532 [==============================] - 117s 219ms/step - loss: 0.0473 - accuracy: 0.9879 - val_loss: 1.1777 - val_accuracy: 0.7231
Epoch 10/200
532/532 [==============================] - 118s 221ms/step - loss: 0.0607 - accuracy: 0.9816 - val_loss: 1.1373 - val_accuracy: 0.7370
Epoch 11/200
532/532 [==============================] - 117s 221ms/step - loss: 0.0456 - accuracy: 0.9855 - val_loss: 1.0936 - val_accuracy: 0.7473
Epoch 12/200
532/532 [==============================] - 118s 222ms/step - loss: 0.0422 - accuracy: 0.9882 - val_loss: 0.9974 - val_accuracy: 0.7842
Epoch 13/200
532/532 [==============================] - 119s 225ms/step - loss: 0.0368 - accuracy: 0.9892 - val_loss: 1.2590 - val_accuracy: 0.7460
Epoch 14/200
532/532 [==============================] - 120s 226ms/step - loss: 0.0408 - accuracy: 0.9878 - val_loss: 1.3810 - val_accuracy: 0.7281
Epoch 15/200
532/532 [==============================] - 120s 226ms/step - loss: 0.0349 - accuracy: 0.9895 - val_loss: 1.2372 - val_accuracy: 0.7344
Epoch 16/200
532/532 [==============================] - 120s 226ms/step - loss: 0.0406 - accuracy: 0.9871 - val_loss: 1.1642 - val_accuracy: 0.7430
Epoch 17/200
532/532 [==============================] - 119s 224ms/step - loss: 0.0403 - accuracy: 0.9865 - val_loss: 1.1213 - val_accuracy: 0.7553
Epoch 18/200
532/532 [==============================] - 119s 223ms/step - loss: 0.0264 - accuracy: 0.9922 - val_loss: 1.2394 - val_accuracy: 0.7626
Epoch 19/200
532/532 [==============================] - 117s 220ms/step - loss: 0.0296 - accuracy: 0.9904 - val_loss: 1.1122 - val_accuracy: 0.7862
Epoch 20/200
532/532 [==============================] - 118s 222ms/step - loss: 0.0304 - accuracy: 0.9910 - val_loss: 1.2227 - val_accuracy: 0.7583
Epoch 21/200
532/532 [==============================] - 118s 222ms/step - loss: 0.0257 - accuracy: 0.9922 - val_loss: 1.2726 - val_accuracy: 0.7447
Epoch 22/200
532/532 [==============================] - 119s 224ms/step - loss: 0.0276 - accuracy: 0.9909 - val_loss: 1.2061 - val_accuracy: 0.7513
Epoch 23/200
532/532 [==============================] - 119s 224ms/step - loss: 0.0341 - accuracy: 0.9885 - val_loss: 1.2089 - val_accuracy: 0.7733
Epoch 24/200
532/532 [==============================] - 120s 226ms/step - loss: 0.0307 - accuracy: 0.9898 - val_loss: 1.3505 - val_accuracy: 0.7606
Epoch 25/200
532/532 [==============================] - 117s 219ms/step - loss: 0.0370 - accuracy: 0.9879 - val_loss: 1.5274 - val_accuracy: 0.7450
Epoch 26/200
532/532 [==============================] - 119s 223ms/step - loss: 0.0260 - accuracy: 0.9911 - val_loss: 1.3952 - val_accuracy: 0.7400
Epoch 27/200
532/532 [==============================] - 119s 223ms/step - loss: 0.0298 - accuracy: 0.9907 - val_loss: 1.2580 - val_accuracy: 0.7753
Test set evaluation metrics
---------------------------
Loss:     0.827
Accuracy: 77.800%
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_40["DENSENET_ALL"] = fit_and_test_model(number_of_classes, DENSENET_MODEL_OPTIMIZED, "DenseNet")
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_14 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_2 ( (None, 1024)              0         
_________________________________________________________________
dense_8 (Dense)              (None, 40)                41000     
=================================================================
Total params: 7,078,504
Trainable params: 6,994,856
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
532/532 [==============================] - 28s 34ms/step - loss: 4.0595 - accuracy: 0.1080 - val_loss: 2.4473 - val_accuracy: 0.4102
Epoch 2/200
532/532 [==============================] - 17s 31ms/step - loss: 2.2613 - accuracy: 0.3906 - val_loss: 1.6684 - val_accuracy: 0.5499
Epoch 3/200
532/532 [==============================] - 17s 31ms/step - loss: 1.7273 - accuracy: 0.5163 - val_loss: 1.4684 - val_accuracy: 0.5904
Epoch 4/200
532/532 [==============================] - 17s 31ms/step - loss: 1.3993 - accuracy: 0.5998 - val_loss: 1.2696 - val_accuracy: 0.6363
Epoch 5/200
532/532 [==============================] - 17s 31ms/step - loss: 1.1156 - accuracy: 0.6701 - val_loss: 1.3461 - val_accuracy: 0.6293
Epoch 6/200
532/532 [==============================] - 17s 31ms/step - loss: 0.9316 - accuracy: 0.7230 - val_loss: 1.2708 - val_accuracy: 0.6396
Epoch 7/200
532/532 [==============================] - 16s 31ms/step - loss: 0.7718 - accuracy: 0.7623 - val_loss: 1.2405 - val_accuracy: 0.6456
Epoch 8/200
532/532 [==============================] - 17s 31ms/step - loss: 0.6358 - accuracy: 0.8056 - val_loss: 1.2208 - val_accuracy: 0.6672
Epoch 9/200
532/532 [==============================] - 17s 32ms/step - loss: 0.5359 - accuracy: 0.8347 - val_loss: 1.2191 - val_accuracy: 0.6715
Epoch 10/200
532/532 [==============================] - 17s 32ms/step - loss: 0.4333 - accuracy: 0.8673 - val_loss: 1.3529 - val_accuracy: 0.6543
Epoch 11/200
532/532 [==============================] - 17s 32ms/step - loss: 0.3632 - accuracy: 0.8848 - val_loss: 1.2985 - val_accuracy: 0.6755
Epoch 12/200
532/532 [==============================] - 17s 31ms/step - loss: 0.3141 - accuracy: 0.9038 - val_loss: 1.3618 - val_accuracy: 0.6719
Epoch 13/200
532/532 [==============================] - 17s 32ms/step - loss: 0.2726 - accuracy: 0.9164 - val_loss: 1.4155 - val_accuracy: 0.6666
Epoch 14/200
532/532 [==============================] - 17s 32ms/step - loss: 0.2436 - accuracy: 0.9249 - val_loss: 1.4338 - val_accuracy: 0.6755
Epoch 15/200
532/532 [==============================] - 17s 31ms/step - loss: 0.2198 - accuracy: 0.9316 - val_loss: 1.4318 - val_accuracy: 0.6616
Epoch 16/200
532/532 [==============================] - 16s 31ms/step - loss: 0.1925 - accuracy: 0.9403 - val_loss: 1.4039 - val_accuracy: 0.6848
Epoch 17/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1647 - accuracy: 0.9473 - val_loss: 1.4781 - val_accuracy: 0.6832
Epoch 18/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1592 - accuracy: 0.9491 - val_loss: 1.5260 - val_accuracy: 0.6775
Epoch 19/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1506 - accuracy: 0.9523 - val_loss: 1.5402 - val_accuracy: 0.6752
Epoch 20/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1421 - accuracy: 0.9538 - val_loss: 1.5716 - val_accuracy: 0.6755
Epoch 21/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1412 - accuracy: 0.9558 - val_loss: 1.5706 - val_accuracy: 0.6789
Epoch 22/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1349 - accuracy: 0.9584 - val_loss: 1.5603 - val_accuracy: 0.6828
Epoch 23/200
532/532 [==============================] - 17s 32ms/step - loss: 0.1198 - accuracy: 0.9629 - val_loss: 1.6429 - val_accuracy: 0.6725
Epoch 24/200
532/532 [==============================] - 17s 32ms/step - loss: 0.1142 - accuracy: 0.9616 - val_loss: 1.5825 - val_accuracy: 0.6752
Epoch 25/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1114 - accuracy: 0.9643 - val_loss: 1.6196 - val_accuracy: 0.6745
Epoch 26/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1081 - accuracy: 0.9659 - val_loss: 1.6664 - val_accuracy: 0.6739
Epoch 27/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1034 - accuracy: 0.9661 - val_loss: 1.6292 - val_accuracy: 0.6759
Epoch 28/200
532/532 [==============================] - 17s 31ms/step - loss: 0.1010 - accuracy: 0.9701 - val_loss: 1.6768 - val_accuracy: 0.6686
Epoch 29/200
532/532 [==============================] - 17s 32ms/step - loss: 0.0851 - accuracy: 0.9723 - val_loss: 1.6419 - val_accuracy: 0.6895
Test set evaluation metrics
---------------------------
Loss:     1.245
Accuracy: 66.450%
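The run above halts at epoch 29 out of a scheduled 200, which is consistent with patience-based early stopping on `val_loss`: the best value (1.2191) occurs at epoch 9 and is followed by exactly 20 epochs without improvement. A minimal pure-Python sketch of that stopping rule, replayed on the logged validation losses — the patience value of 20 is an assumption inferred from the log, not read from the notebook's training code:

```python
# Patience-based early stopping, replayed on the logged val_loss values above.
# patience=20 is an inferred assumption, not taken from the notebook's code.
def early_stop_epoch(val_losses, patience):
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:          # improvement: remember it, reset the counter
            best, wait = loss, 0
        else:                    # no improvement: count towards patience
            wait += 1
            if wait >= patience:
                return epoch     # epoch at which training halts
    return len(val_losses)

logged = [2.4473, 1.6684, 1.4684, 1.2696, 1.3461, 1.2708, 1.2405,
          1.2208, 1.2191, 1.3529, 1.2985, 1.3618, 1.4155, 1.4338,
          1.4318, 1.4039, 1.4781, 1.5260, 1.5402, 1.5716, 1.5706,
          1.5603, 1.6429, 1.5825, 1.6196, 1.6664, 1.6292, 1.6768,
          1.6419]
print(early_stop_epoch(logged, patience=20))  # → 29
```

In Keras this behavior would correspond to a `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=20)` callback passed to `model.fit`.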

Number of classes = 60

"From scratch" networks

In [ ]:
# Number of classes
number_of_classes = 60

accuracies_opt_60 = {}  # test accuracy of each optimized model on the 60-class problem
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary=True, classes_num=number_of_classes)
accuracies_opt_60["SIMPLE_MODEL"] = fit_and_test_model(number_of_classes, SIMPLE_MODEL_OPTIMIZED, "Simple Model")
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization (BatchNo (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu (ReLU)                 (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 32)        0         
_________________________________________________________________
dropout (Dropout)            (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_1 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_1 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_2 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_2 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten (Flatten)            (None, 1024)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 64)                65600     
_________________________________________________________________
dense_1 (Dense)              (None, 60)                3900      
=================================================================
Total params: 126,460
Trainable params: 126,140
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
797/797 [==============================] - 10s 5ms/step - loss: 5.1846 - accuracy: 0.0339 - val_loss: 4.6650 - val_accuracy: 0.0940
Epoch 2/200
797/797 [==============================] - 3s 4ms/step - loss: 4.5015 - accuracy: 0.1109 - val_loss: 4.2282 - val_accuracy: 0.1332
Epoch 3/200
797/797 [==============================] - 3s 4ms/step - loss: 4.0269 - accuracy: 0.1604 - val_loss: 3.8110 - val_accuracy: 0.1866
Epoch 4/200
797/797 [==============================] - 3s 4ms/step - loss: 3.6912 - accuracy: 0.2025 - val_loss: 3.4789 - val_accuracy: 0.2314
Epoch 5/200
797/797 [==============================] - 3s 4ms/step - loss: 3.4483 - accuracy: 0.2353 - val_loss: 3.3508 - val_accuracy: 0.2447
Epoch 6/200
797/797 [==============================] - 3s 4ms/step - loss: 3.2505 - accuracy: 0.2628 - val_loss: 3.2416 - val_accuracy: 0.2507
Epoch 7/200
797/797 [==============================] - 3s 4ms/step - loss: 3.0965 - accuracy: 0.2859 - val_loss: 3.2121 - val_accuracy: 0.2560
Epoch 8/200
797/797 [==============================] - 3s 4ms/step - loss: 2.9593 - accuracy: 0.3133 - val_loss: 3.1255 - val_accuracy: 0.2666
Epoch 9/200
797/797 [==============================] - 3s 4ms/step - loss: 2.8642 - accuracy: 0.3223 - val_loss: 2.9486 - val_accuracy: 0.2950
Epoch 10/200
797/797 [==============================] - 3s 4ms/step - loss: 2.7679 - accuracy: 0.3383 - val_loss: 2.8461 - val_accuracy: 0.3198
Epoch 11/200
797/797 [==============================] - 3s 4ms/step - loss: 2.6937 - accuracy: 0.3572 - val_loss: 2.8117 - val_accuracy: 0.3247
Epoch 12/200
797/797 [==============================] - 3s 4ms/step - loss: 2.6017 - accuracy: 0.3708 - val_loss: 2.6880 - val_accuracy: 0.3508
Epoch 13/200
797/797 [==============================] - 3s 4ms/step - loss: 2.5315 - accuracy: 0.3794 - val_loss: 2.6080 - val_accuracy: 0.3557
Epoch 14/200
797/797 [==============================] - 3s 4ms/step - loss: 2.4886 - accuracy: 0.3916 - val_loss: 2.7213 - val_accuracy: 0.3378
Epoch 15/200
797/797 [==============================] - 3s 4ms/step - loss: 2.4242 - accuracy: 0.3944 - val_loss: 2.6069 - val_accuracy: 0.3577
Epoch 16/200
797/797 [==============================] - 3s 4ms/step - loss: 2.3914 - accuracy: 0.4068 - val_loss: 2.5589 - val_accuracy: 0.3750
Epoch 17/200
797/797 [==============================] - 3s 4ms/step - loss: 2.3361 - accuracy: 0.4200 - val_loss: 2.3971 - val_accuracy: 0.4029
Epoch 18/200
797/797 [==============================] - 3s 4ms/step - loss: 2.3088 - accuracy: 0.4250 - val_loss: 2.4960 - val_accuracy: 0.3865
Epoch 19/200
797/797 [==============================] - 3s 4ms/step - loss: 2.2644 - accuracy: 0.4326 - val_loss: 2.5198 - val_accuracy: 0.3743
Epoch 20/200
797/797 [==============================] - 3s 4ms/step - loss: 2.2300 - accuracy: 0.4375 - val_loss: 2.4650 - val_accuracy: 0.3903
Epoch 21/200
797/797 [==============================] - 3s 4ms/step - loss: 2.1719 - accuracy: 0.4515 - val_loss: 2.4544 - val_accuracy: 0.3932
Epoch 22/200
797/797 [==============================] - 3s 4ms/step - loss: 2.1639 - accuracy: 0.4513 - val_loss: 2.2048 - val_accuracy: 0.4419
Epoch 23/200
797/797 [==============================] - 3s 4ms/step - loss: 2.1303 - accuracy: 0.4562 - val_loss: 2.2415 - val_accuracy: 0.4371
Epoch 24/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0867 - accuracy: 0.4690 - val_loss: 2.2810 - val_accuracy: 0.4359
Epoch 25/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0974 - accuracy: 0.4659 - val_loss: 2.1469 - val_accuracy: 0.4619
Epoch 26/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0615 - accuracy: 0.4737 - val_loss: 2.2591 - val_accuracy: 0.4366
Epoch 27/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0185 - accuracy: 0.4808 - val_loss: 2.0909 - val_accuracy: 0.4763
Epoch 28/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0069 - accuracy: 0.4833 - val_loss: 2.3108 - val_accuracy: 0.4246
Epoch 29/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9797 - accuracy: 0.4923 - val_loss: 2.1530 - val_accuracy: 0.4572
Epoch 30/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9706 - accuracy: 0.4961 - val_loss: 2.2086 - val_accuracy: 0.4499
Epoch 31/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9436 - accuracy: 0.4934 - val_loss: 2.1011 - val_accuracy: 0.4683
Epoch 32/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9376 - accuracy: 0.4981 - val_loss: 2.0562 - val_accuracy: 0.4789
Epoch 33/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9155 - accuracy: 0.5049 - val_loss: 2.3190 - val_accuracy: 0.4158
Epoch 34/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8887 - accuracy: 0.5078 - val_loss: 1.9994 - val_accuracy: 0.4878
Epoch 35/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8804 - accuracy: 0.5095 - val_loss: 2.1440 - val_accuracy: 0.4561
Epoch 36/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8649 - accuracy: 0.5151 - val_loss: 2.0004 - val_accuracy: 0.4845
Epoch 37/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8716 - accuracy: 0.5187 - val_loss: 2.3334 - val_accuracy: 0.4264
Epoch 38/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8354 - accuracy: 0.5198 - val_loss: 1.9930 - val_accuracy: 0.4894
Epoch 39/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8343 - accuracy: 0.5229 - val_loss: 1.9606 - val_accuracy: 0.4987
Epoch 40/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8108 - accuracy: 0.5293 - val_loss: 1.9773 - val_accuracy: 0.5002
Epoch 41/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8100 - accuracy: 0.5307 - val_loss: 1.9936 - val_accuracy: 0.4984
Epoch 42/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7924 - accuracy: 0.5343 - val_loss: 1.9586 - val_accuracy: 0.5071
Epoch 43/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7852 - accuracy: 0.5327 - val_loss: 1.9313 - val_accuracy: 0.5058
Epoch 44/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7859 - accuracy: 0.5324 - val_loss: 1.9423 - val_accuracy: 0.5029
Epoch 45/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7551 - accuracy: 0.5412 - val_loss: 1.9747 - val_accuracy: 0.4973
Epoch 46/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7717 - accuracy: 0.5376 - val_loss: 1.9243 - val_accuracy: 0.5173
Epoch 47/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7448 - accuracy: 0.5469 - val_loss: 1.9639 - val_accuracy: 0.5066
Epoch 48/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7456 - accuracy: 0.5411 - val_loss: 1.9238 - val_accuracy: 0.5073
Epoch 49/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7252 - accuracy: 0.5450 - val_loss: 1.9393 - val_accuracy: 0.5066
Epoch 50/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7052 - accuracy: 0.5497 - val_loss: 1.8795 - val_accuracy: 0.5213
Epoch 51/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7014 - accuracy: 0.5548 - val_loss: 2.0129 - val_accuracy: 0.4927
Epoch 52/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6995 - accuracy: 0.5539 - val_loss: 1.9751 - val_accuracy: 0.4929
Epoch 53/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6926 - accuracy: 0.5544 - val_loss: 1.9816 - val_accuracy: 0.4936
Epoch 54/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6839 - accuracy: 0.5641 - val_loss: 1.9030 - val_accuracy: 0.5084
Epoch 55/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6786 - accuracy: 0.5565 - val_loss: 1.9985 - val_accuracy: 0.5022
Epoch 56/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6638 - accuracy: 0.5592 - val_loss: 1.9901 - val_accuracy: 0.4980
Epoch 57/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6603 - accuracy: 0.5594 - val_loss: 1.8798 - val_accuracy: 0.5151
Epoch 58/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6508 - accuracy: 0.5634 - val_loss: 1.9688 - val_accuracy: 0.4971
Epoch 59/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6413 - accuracy: 0.5658 - val_loss: 1.9503 - val_accuracy: 0.5007
Epoch 60/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6421 - accuracy: 0.5658 - val_loss: 1.9776 - val_accuracy: 0.4956
Epoch 61/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6198 - accuracy: 0.5752 - val_loss: 1.8298 - val_accuracy: 0.5328
Epoch 62/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6314 - accuracy: 0.5693 - val_loss: 1.8567 - val_accuracy: 0.5288
Epoch 63/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6059 - accuracy: 0.5762 - val_loss: 1.8502 - val_accuracy: 0.5248
Epoch 64/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6171 - accuracy: 0.5767 - val_loss: 2.0295 - val_accuracy: 0.4856
Epoch 65/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6213 - accuracy: 0.5719 - val_loss: 1.8956 - val_accuracy: 0.5122
Epoch 66/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5906 - accuracy: 0.5765 - val_loss: 1.8489 - val_accuracy: 0.5299
Epoch 67/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5906 - accuracy: 0.5840 - val_loss: 2.0001 - val_accuracy: 0.4949
Epoch 68/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5902 - accuracy: 0.5824 - val_loss: 1.8046 - val_accuracy: 0.5301
Epoch 69/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6067 - accuracy: 0.5759 - val_loss: 1.9957 - val_accuracy: 0.4925
Epoch 70/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5826 - accuracy: 0.5833 - val_loss: 1.9269 - val_accuracy: 0.5106
Epoch 71/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5674 - accuracy: 0.5829 - val_loss: 2.0061 - val_accuracy: 0.4931
Epoch 72/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5596 - accuracy: 0.5884 - val_loss: 1.9038 - val_accuracy: 0.5131
Epoch 73/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5696 - accuracy: 0.5839 - val_loss: 1.8231 - val_accuracy: 0.5315
Epoch 74/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5633 - accuracy: 0.5845 - val_loss: 1.9010 - val_accuracy: 0.5160
Epoch 75/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5585 - accuracy: 0.5823 - val_loss: 2.0885 - val_accuracy: 0.4883
Epoch 76/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5602 - accuracy: 0.5885 - val_loss: 1.8747 - val_accuracy: 0.5151
Epoch 77/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5328 - accuracy: 0.5916 - val_loss: 2.0046 - val_accuracy: 0.5038
Epoch 78/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5416 - accuracy: 0.5910 - val_loss: 1.8888 - val_accuracy: 0.5233
Epoch 79/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5370 - accuracy: 0.5949 - val_loss: 1.9465 - val_accuracy: 0.5044
Epoch 80/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5333 - accuracy: 0.5944 - val_loss: 1.9529 - val_accuracy: 0.5084
Epoch 81/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5190 - accuracy: 0.5973 - val_loss: 1.9300 - val_accuracy: 0.5160
Epoch 82/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5286 - accuracy: 0.5970 - val_loss: 1.9389 - val_accuracy: 0.5060
Epoch 83/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5422 - accuracy: 0.5893 - val_loss: 1.7904 - val_accuracy: 0.5379
Epoch 84/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5007 - accuracy: 0.6072 - val_loss: 1.8034 - val_accuracy: 0.5372
Epoch 85/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5088 - accuracy: 0.5979 - val_loss: 1.9398 - val_accuracy: 0.5126
Epoch 86/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4958 - accuracy: 0.5985 - val_loss: 1.9482 - val_accuracy: 0.5095
Epoch 87/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5014 - accuracy: 0.6030 - val_loss: 1.8516 - val_accuracy: 0.5315
Epoch 88/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5001 - accuracy: 0.6035 - val_loss: 1.7985 - val_accuracy: 0.5403
Epoch 89/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4974 - accuracy: 0.6008 - val_loss: 1.9222 - val_accuracy: 0.5109
Epoch 90/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4818 - accuracy: 0.6086 - val_loss: 1.9702 - val_accuracy: 0.5029
Epoch 91/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4999 - accuracy: 0.6036 - val_loss: 2.0535 - val_accuracy: 0.4900
Epoch 92/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4929 - accuracy: 0.6059 - val_loss: 1.9777 - val_accuracy: 0.5089
Epoch 93/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4946 - accuracy: 0.5997 - val_loss: 1.8930 - val_accuracy: 0.5257
Epoch 94/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4769 - accuracy: 0.6102 - val_loss: 1.8543 - val_accuracy: 0.5319
Epoch 95/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4754 - accuracy: 0.6054 - val_loss: 1.8551 - val_accuracy: 0.5312
Epoch 96/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4690 - accuracy: 0.6068 - val_loss: 1.9288 - val_accuracy: 0.5142
Epoch 97/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4573 - accuracy: 0.6105 - val_loss: 1.9267 - val_accuracy: 0.5193
Epoch 98/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4613 - accuracy: 0.6043 - val_loss: 1.9160 - val_accuracy: 0.5155
Epoch 99/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4739 - accuracy: 0.6094 - val_loss: 1.9065 - val_accuracy: 0.5215
Epoch 100/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4584 - accuracy: 0.6114 - val_loss: 1.7586 - val_accuracy: 0.5479
Epoch 101/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4593 - accuracy: 0.6113 - val_loss: 1.9042 - val_accuracy: 0.5248
Epoch 102/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4723 - accuracy: 0.6057 - val_loss: 1.9766 - val_accuracy: 0.5171
Epoch 103/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4368 - accuracy: 0.6191 - val_loss: 1.9733 - val_accuracy: 0.5131
Epoch 104/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4224 - accuracy: 0.6248 - val_loss: 1.8266 - val_accuracy: 0.5330
Epoch 105/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4284 - accuracy: 0.6211 - val_loss: 2.0118 - val_accuracy: 0.5024
Epoch 106/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4279 - accuracy: 0.6176 - val_loss: 1.9120 - val_accuracy: 0.5213
Epoch 107/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4191 - accuracy: 0.6253 - val_loss: 1.9463 - val_accuracy: 0.5188
Epoch 108/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4449 - accuracy: 0.6161 - val_loss: 1.9305 - val_accuracy: 0.5224
Epoch 109/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3963 - accuracy: 0.6323 - val_loss: 1.9131 - val_accuracy: 0.5180
Epoch 110/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4333 - accuracy: 0.6208 - val_loss: 1.9605 - val_accuracy: 0.5177
Epoch 111/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4260 - accuracy: 0.6195 - val_loss: 1.9660 - val_accuracy: 0.5082
Epoch 112/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4088 - accuracy: 0.6269 - val_loss: 1.9220 - val_accuracy: 0.5299
Epoch 113/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4063 - accuracy: 0.6259 - val_loss: 1.8483 - val_accuracy: 0.5375
Epoch 114/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4234 - accuracy: 0.6215 - val_loss: 1.7681 - val_accuracy: 0.5539
Epoch 115/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4081 - accuracy: 0.6244 - val_loss: 1.9305 - val_accuracy: 0.5273
Epoch 116/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4197 - accuracy: 0.6205 - val_loss: 1.8802 - val_accuracy: 0.5332
Epoch 117/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4025 - accuracy: 0.6198 - val_loss: 1.8444 - val_accuracy: 0.5383
Epoch 118/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4070 - accuracy: 0.6241 - val_loss: 1.8507 - val_accuracy: 0.5430
Epoch 119/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4013 - accuracy: 0.6229 - val_loss: 1.8545 - val_accuracy: 0.5326
Epoch 120/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3992 - accuracy: 0.6286 - val_loss: 1.7964 - val_accuracy: 0.5392
Test set evaluation metrics
---------------------------
Loss:     1.734
Accuracy: 55.552%
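The parameter counts in the Simple CNN summary above can be sanity-checked by hand: a 3×3 `Conv2D` layer has (3·3·C_in + 1)·C_out parameters (kernel plus bias), `BatchNormalization` has 4·C (gamma and beta are trainable; the moving mean and variance are not), and a `Dense` layer has (n_in + 1)·n_out. A short sketch reproducing the summary's totals from the reported layer shapes:

```python
# Recompute the Simple CNN parameter counts from the shapes in model.summary().
def conv_params(k, c_in, c_out):
    return (k * k * c_in + 1) * c_out   # kernel weights + bias

def bn_params(c):
    return 4 * c                        # gamma, beta, moving mean, moving variance

def dense_params(n_in, n_out):
    return (n_in + 1) * n_out           # weights + bias

total = (conv_params(3, 3, 32) + bn_params(32)      # conv2d + batch_normalization
         + conv_params(3, 32, 64) + bn_params(64)   # conv2d_1 + batch_normalization_1
         + conv_params(3, 64, 64) + bn_params(64)   # conv2d_2 + batch_normalization_2
         + dense_params(1024, 64)                   # dense (after Flatten: 4*4*64 = 1024)
         + dense_params(64, 60))                    # dense_1 (60 classes)

# Half of each BatchNorm layer's parameters (moving statistics) are non-trainable.
non_trainable = (bn_params(32) + bn_params(64) + bn_params(64)) // 2
print(total, total - non_trainable, non_trainable)  # → 126460 126140 320
```

The three numbers match the summary: 126,460 total, 126,140 trainable, 320 non-trainable.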
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary=True, classes_num=number_of_classes)
accuracies_opt_60["CNN1"] = fit_and_test_model(number_of_classes, CNN1_MODEL_OPTIMIZED, "Cnn1")
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_4 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dropout_6 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 60)                61500     
=================================================================
Total params: 680,956
Trainable params: 680,508
Non-trainable params: 448
_________________________________________________________________
Epoch 1/200
797/797 [==============================] - 4s 4ms/step - loss: 5.1043 - accuracy: 0.0706 - val_loss: 4.3558 - val_accuracy: 0.1569
Epoch 2/200
797/797 [==============================] - 3s 4ms/step - loss: 4.1248 - accuracy: 0.1777 - val_loss: 3.7967 - val_accuracy: 0.2117
Epoch 3/200
797/797 [==============================] - 3s 4ms/step - loss: 3.6497 - accuracy: 0.2366 - val_loss: 3.5137 - val_accuracy: 0.2431
Epoch 4/200
797/797 [==============================] - 3s 4ms/step - loss: 3.3150 - accuracy: 0.2834 - val_loss: 3.4217 - val_accuracy: 0.2358
Epoch 5/200
797/797 [==============================] - 3s 4ms/step - loss: 3.0823 - accuracy: 0.3116 - val_loss: 3.1723 - val_accuracy: 0.2781
Epoch 6/200
797/797 [==============================] - 3s 4ms/step - loss: 2.9038 - accuracy: 0.3374 - val_loss: 2.7878 - val_accuracy: 0.3553
Epoch 7/200
797/797 [==============================] - 3s 4ms/step - loss: 2.7732 - accuracy: 0.3505 - val_loss: 2.9894 - val_accuracy: 0.3001
Epoch 8/200
797/797 [==============================] - 3s 4ms/step - loss: 2.6521 - accuracy: 0.3743 - val_loss: 2.8475 - val_accuracy: 0.3256
Epoch 9/200
797/797 [==============================] - 3s 4ms/step - loss: 2.5544 - accuracy: 0.3900 - val_loss: 2.6671 - val_accuracy: 0.3648
Epoch 10/200
797/797 [==============================] - 3s 4ms/step - loss: 2.4692 - accuracy: 0.4064 - val_loss: 2.7537 - val_accuracy: 0.3500
Epoch 11/200
797/797 [==============================] - 3s 4ms/step - loss: 2.3967 - accuracy: 0.4209 - val_loss: 2.5396 - val_accuracy: 0.3916
Epoch 12/200
797/797 [==============================] - 3s 4ms/step - loss: 2.3210 - accuracy: 0.4293 - val_loss: 2.3990 - val_accuracy: 0.4209
Epoch 13/200
797/797 [==============================] - 3s 4ms/step - loss: 2.2682 - accuracy: 0.4400 - val_loss: 2.2959 - val_accuracy: 0.4366
Epoch 14/200
797/797 [==============================] - 3s 4ms/step - loss: 2.2294 - accuracy: 0.4428 - val_loss: 2.4653 - val_accuracy: 0.4060
Epoch 15/200
797/797 [==============================] - 3s 4ms/step - loss: 2.1667 - accuracy: 0.4568 - val_loss: 2.3072 - val_accuracy: 0.4322
Epoch 16/200
797/797 [==============================] - 3s 4ms/step - loss: 2.1237 - accuracy: 0.4733 - val_loss: 2.2375 - val_accuracy: 0.4517
Epoch 17/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0826 - accuracy: 0.4732 - val_loss: 2.1562 - val_accuracy: 0.4663
Epoch 18/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0510 - accuracy: 0.4781 - val_loss: 2.1593 - val_accuracy: 0.4650
Epoch 19/200
797/797 [==============================] - 3s 4ms/step - loss: 2.0199 - accuracy: 0.4831 - val_loss: 2.1418 - val_accuracy: 0.4738
Epoch 20/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9906 - accuracy: 0.4928 - val_loss: 2.1496 - val_accuracy: 0.4668
Epoch 21/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9491 - accuracy: 0.5002 - val_loss: 2.0986 - val_accuracy: 0.4738
Epoch 22/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9275 - accuracy: 0.5136 - val_loss: 2.0784 - val_accuracy: 0.4836
Epoch 23/200
797/797 [==============================] - 3s 4ms/step - loss: 1.9059 - accuracy: 0.5139 - val_loss: 1.9953 - val_accuracy: 0.4996
Epoch 24/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8721 - accuracy: 0.5167 - val_loss: 2.0190 - val_accuracy: 0.4940
Epoch 25/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8634 - accuracy: 0.5213 - val_loss: 2.0314 - val_accuracy: 0.4918
Epoch 26/200
797/797 [==============================] - 3s 4ms/step - loss: 1.8104 - accuracy: 0.5352 - val_loss: 2.0090 - val_accuracy: 0.4976
Epoch 27/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7942 - accuracy: 0.5313 - val_loss: 1.9534 - val_accuracy: 0.5049
Epoch 28/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7833 - accuracy: 0.5356 - val_loss: 1.9362 - val_accuracy: 0.5093
Epoch 29/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7564 - accuracy: 0.5424 - val_loss: 1.9305 - val_accuracy: 0.5188
Epoch 30/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7414 - accuracy: 0.5474 - val_loss: 1.8167 - val_accuracy: 0.5401
Epoch 31/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7233 - accuracy: 0.5516 - val_loss: 1.9011 - val_accuracy: 0.5180
Epoch 32/200
797/797 [==============================] - 3s 4ms/step - loss: 1.7178 - accuracy: 0.5536 - val_loss: 2.0238 - val_accuracy: 0.4876
Epoch 33/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6803 - accuracy: 0.5578 - val_loss: 1.8206 - val_accuracy: 0.5395
Epoch 34/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6773 - accuracy: 0.5602 - val_loss: 1.8482 - val_accuracy: 0.5312
Epoch 35/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6646 - accuracy: 0.5667 - val_loss: 1.8593 - val_accuracy: 0.5306
Epoch 36/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6445 - accuracy: 0.5674 - val_loss: 1.8154 - val_accuracy: 0.5401
Epoch 37/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6189 - accuracy: 0.5699 - val_loss: 1.9521 - val_accuracy: 0.5086
Epoch 38/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6047 - accuracy: 0.5756 - val_loss: 1.8713 - val_accuracy: 0.5270
Epoch 39/200
797/797 [==============================] - 3s 4ms/step - loss: 1.6272 - accuracy: 0.5753 - val_loss: 1.8484 - val_accuracy: 0.5328
Epoch 40/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5676 - accuracy: 0.5892 - val_loss: 1.8362 - val_accuracy: 0.5312
Epoch 41/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5575 - accuracy: 0.5902 - val_loss: 1.7919 - val_accuracy: 0.5472
Epoch 42/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5716 - accuracy: 0.5868 - val_loss: 1.8852 - val_accuracy: 0.5262
Epoch 43/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5433 - accuracy: 0.5931 - val_loss: 1.7818 - val_accuracy: 0.5503
Epoch 44/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5380 - accuracy: 0.5912 - val_loss: 1.8345 - val_accuracy: 0.5297
Epoch 45/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5238 - accuracy: 0.5973 - val_loss: 1.7638 - val_accuracy: 0.5607
Epoch 46/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5177 - accuracy: 0.5993 - val_loss: 1.8171 - val_accuracy: 0.5457
Epoch 47/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4709 - accuracy: 0.6123 - val_loss: 2.0262 - val_accuracy: 0.5044
Epoch 48/200
797/797 [==============================] - 3s 4ms/step - loss: 1.5105 - accuracy: 0.6034 - val_loss: 1.8278 - val_accuracy: 0.5423
Epoch 49/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4914 - accuracy: 0.6015 - val_loss: 1.7873 - val_accuracy: 0.5485
Epoch 50/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4765 - accuracy: 0.6067 - val_loss: 1.8015 - val_accuracy: 0.5439
Epoch 51/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4655 - accuracy: 0.6150 - val_loss: 1.8447 - val_accuracy: 0.5395
Epoch 52/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4519 - accuracy: 0.6197 - val_loss: 1.7789 - val_accuracy: 0.5483
Epoch 53/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4250 - accuracy: 0.6208 - val_loss: 1.7738 - val_accuracy: 0.5479
Epoch 54/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4263 - accuracy: 0.6236 - val_loss: 1.7904 - val_accuracy: 0.5457
Epoch 55/200
797/797 [==============================] - 3s 4ms/step - loss: 1.4292 - accuracy: 0.6235 - val_loss: 1.7552 - val_accuracy: 0.5612
Epoch 56/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3940 - accuracy: 0.6260 - val_loss: 1.6504 - val_accuracy: 0.5773
Epoch 57/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3894 - accuracy: 0.6283 - val_loss: 1.7501 - val_accuracy: 0.5570
Epoch 58/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3958 - accuracy: 0.6311 - val_loss: 1.7160 - val_accuracy: 0.5658
Epoch 59/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3946 - accuracy: 0.6329 - val_loss: 1.6644 - val_accuracy: 0.5751
Epoch 60/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3675 - accuracy: 0.6383 - val_loss: 1.7025 - val_accuracy: 0.5756
Epoch 61/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3515 - accuracy: 0.6404 - val_loss: 1.7510 - val_accuracy: 0.5647
Epoch 62/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3445 - accuracy: 0.6439 - val_loss: 1.7420 - val_accuracy: 0.5663
Epoch 63/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3582 - accuracy: 0.6415 - val_loss: 1.6919 - val_accuracy: 0.5716
Epoch 64/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3455 - accuracy: 0.6431 - val_loss: 1.7770 - val_accuracy: 0.5494
Epoch 65/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3131 - accuracy: 0.6518 - val_loss: 1.7715 - val_accuracy: 0.5554
Epoch 66/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3296 - accuracy: 0.6476 - val_loss: 1.6666 - val_accuracy: 0.5745
Epoch 67/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3100 - accuracy: 0.6529 - val_loss: 1.7694 - val_accuracy: 0.5525
Epoch 68/200
797/797 [==============================] - 3s 4ms/step - loss: 1.3023 - accuracy: 0.6550 - val_loss: 1.7393 - val_accuracy: 0.5605
Epoch 69/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2966 - accuracy: 0.6599 - val_loss: 1.7212 - val_accuracy: 0.5647
Epoch 70/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2984 - accuracy: 0.6516 - val_loss: 1.7178 - val_accuracy: 0.5669
Epoch 71/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2929 - accuracy: 0.6483 - val_loss: 1.7231 - val_accuracy: 0.5691
Epoch 72/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2957 - accuracy: 0.6558 - val_loss: 1.6945 - val_accuracy: 0.5678
Epoch 73/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2768 - accuracy: 0.6588 - val_loss: 1.7867 - val_accuracy: 0.5596
Epoch 74/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2692 - accuracy: 0.6584 - val_loss: 1.7461 - val_accuracy: 0.5654
Epoch 75/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2607 - accuracy: 0.6641 - val_loss: 1.7225 - val_accuracy: 0.5718
Epoch 76/200
797/797 [==============================] - 3s 4ms/step - loss: 1.2506 - accuracy: 0.6627 - val_loss: 1.7018 - val_accuracy: 0.5751
Test set evaluation metrics
---------------------------
Loss:     1.646
Accuracy: 57.596%
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_60["CNN2"] = fit_and_test_model(number_of_classes, CNN2_MODEL_OPTIMIZED, "Cnn2")
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_7 (Dropout)          (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_8 (Dropout)          (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_9 (Dropout)          (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_9 (Batch (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_9 (ReLU)               (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_10 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dropout_11 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 60)                30780     
=================================================================
Total params: 2,518,780
Trainable params: 2,517,820
Non-trainable params: 960
_________________________________________________________________
Epoch 1/200
797/797 [==============================] - 5s 5ms/step - loss: 6.8543 - accuracy: 0.0591 - val_loss: 5.8799 - val_accuracy: 0.1135
Epoch 2/200
797/797 [==============================] - 4s 5ms/step - loss: 5.5623 - accuracy: 0.1526 - val_loss: 4.9341 - val_accuracy: 0.1957
Epoch 3/200
797/797 [==============================] - 4s 5ms/step - loss: 4.7830 - accuracy: 0.2081 - val_loss: 4.3475 - val_accuracy: 0.2325
Epoch 4/200
797/797 [==============================] - 4s 5ms/step - loss: 4.2213 - accuracy: 0.2440 - val_loss: 4.0872 - val_accuracy: 0.2400
Epoch 5/200
797/797 [==============================] - 4s 5ms/step - loss: 3.7749 - accuracy: 0.2856 - val_loss: 3.6935 - val_accuracy: 0.2793
Epoch 6/200
797/797 [==============================] - 4s 5ms/step - loss: 3.4392 - accuracy: 0.3219 - val_loss: 3.4440 - val_accuracy: 0.3041
Epoch 7/200
797/797 [==============================] - 4s 5ms/step - loss: 3.1677 - accuracy: 0.3470 - val_loss: 3.4292 - val_accuracy: 0.2877
Epoch 8/200
797/797 [==============================] - 4s 5ms/step - loss: 2.9843 - accuracy: 0.3687 - val_loss: 3.0590 - val_accuracy: 0.3473
Epoch 9/200
797/797 [==============================] - 4s 5ms/step - loss: 2.7931 - accuracy: 0.3947 - val_loss: 3.0698 - val_accuracy: 0.3444
Epoch 10/200
797/797 [==============================] - 4s 5ms/step - loss: 2.6499 - accuracy: 0.4110 - val_loss: 2.8273 - val_accuracy: 0.3779
Epoch 11/200
797/797 [==============================] - 4s 5ms/step - loss: 2.5342 - accuracy: 0.4274 - val_loss: 2.5930 - val_accuracy: 0.4207
Epoch 12/200
797/797 [==============================] - 4s 5ms/step - loss: 2.4173 - accuracy: 0.4491 - val_loss: 2.6345 - val_accuracy: 0.4100
Epoch 13/200
797/797 [==============================] - 4s 5ms/step - loss: 2.3262 - accuracy: 0.4594 - val_loss: 2.3578 - val_accuracy: 0.4650
Epoch 14/200
797/797 [==============================] - 4s 5ms/step - loss: 2.2696 - accuracy: 0.4729 - val_loss: 2.4329 - val_accuracy: 0.4433
Epoch 15/200
797/797 [==============================] - 4s 5ms/step - loss: 2.1778 - accuracy: 0.4866 - val_loss: 2.3585 - val_accuracy: 0.4630
Epoch 16/200
797/797 [==============================] - 4s 5ms/step - loss: 2.1288 - accuracy: 0.4931 - val_loss: 2.1973 - val_accuracy: 0.4938
Epoch 17/200
797/797 [==============================] - 4s 5ms/step - loss: 2.0644 - accuracy: 0.5091 - val_loss: 2.2145 - val_accuracy: 0.4863
Epoch 18/200
797/797 [==============================] - 4s 5ms/step - loss: 2.0198 - accuracy: 0.5152 - val_loss: 2.1746 - val_accuracy: 0.4885
Epoch 19/200
797/797 [==============================] - 4s 5ms/step - loss: 1.9589 - accuracy: 0.5297 - val_loss: 2.1483 - val_accuracy: 0.4984
Epoch 20/200
797/797 [==============================] - 4s 5ms/step - loss: 1.8964 - accuracy: 0.5429 - val_loss: 2.6025 - val_accuracy: 0.4133
Epoch 21/200
797/797 [==============================] - 4s 5ms/step - loss: 1.8563 - accuracy: 0.5510 - val_loss: 2.0462 - val_accuracy: 0.5153
Epoch 22/200
797/797 [==============================] - 4s 5ms/step - loss: 1.8285 - accuracy: 0.5594 - val_loss: 2.0941 - val_accuracy: 0.5109
Epoch 23/200
797/797 [==============================] - 4s 5ms/step - loss: 1.7887 - accuracy: 0.5652 - val_loss: 2.0944 - val_accuracy: 0.5089
Epoch 24/200
797/797 [==============================] - 4s 5ms/step - loss: 1.7506 - accuracy: 0.5734 - val_loss: 2.0037 - val_accuracy: 0.5288
Epoch 25/200
797/797 [==============================] - 4s 5ms/step - loss: 1.7172 - accuracy: 0.5829 - val_loss: 2.0470 - val_accuracy: 0.5211
Epoch 26/200
797/797 [==============================] - 4s 5ms/step - loss: 1.6789 - accuracy: 0.5916 - val_loss: 2.1452 - val_accuracy: 0.5047
Epoch 27/200
797/797 [==============================] - 4s 5ms/step - loss: 1.6519 - accuracy: 0.5999 - val_loss: 2.0166 - val_accuracy: 0.5277
Epoch 28/200
797/797 [==============================] - 4s 5ms/step - loss: 1.6234 - accuracy: 0.6062 - val_loss: 2.1115 - val_accuracy: 0.5075
Epoch 29/200
797/797 [==============================] - 4s 5ms/step - loss: 1.6046 - accuracy: 0.6095 - val_loss: 1.9664 - val_accuracy: 0.5388
Epoch 30/200
797/797 [==============================] - 4s 5ms/step - loss: 1.5856 - accuracy: 0.6132 - val_loss: 2.0351 - val_accuracy: 0.5262
Epoch 31/200
797/797 [==============================] - 4s 5ms/step - loss: 1.5403 - accuracy: 0.6215 - val_loss: 2.0887 - val_accuracy: 0.5195
Epoch 32/200
797/797 [==============================] - 4s 5ms/step - loss: 1.5548 - accuracy: 0.6192 - val_loss: 1.9325 - val_accuracy: 0.5519
Epoch 33/200
797/797 [==============================] - 4s 5ms/step - loss: 1.4987 - accuracy: 0.6351 - val_loss: 1.9454 - val_accuracy: 0.5505
Epoch 34/200
797/797 [==============================] - 4s 5ms/step - loss: 1.4730 - accuracy: 0.6403 - val_loss: 2.0707 - val_accuracy: 0.5257
Epoch 35/200
797/797 [==============================] - 4s 5ms/step - loss: 1.4364 - accuracy: 0.6497 - val_loss: 1.9330 - val_accuracy: 0.5543
Epoch 36/200
797/797 [==============================] - 4s 5ms/step - loss: 1.4438 - accuracy: 0.6496 - val_loss: 1.8892 - val_accuracy: 0.5616
Epoch 37/200
797/797 [==============================] - 4s 5ms/step - loss: 1.4262 - accuracy: 0.6537 - val_loss: 1.8743 - val_accuracy: 0.5623
Epoch 38/200
797/797 [==============================] - 4s 5ms/step - loss: 1.4023 - accuracy: 0.6555 - val_loss: 2.0412 - val_accuracy: 0.5312
Epoch 39/200
797/797 [==============================] - 4s 5ms/step - loss: 1.3876 - accuracy: 0.6669 - val_loss: 1.8374 - val_accuracy: 0.5676
Epoch 40/200
797/797 [==============================] - 4s 5ms/step - loss: 1.3738 - accuracy: 0.6691 - val_loss: 1.9247 - val_accuracy: 0.5567
Epoch 41/200
797/797 [==============================] - 4s 5ms/step - loss: 1.3543 - accuracy: 0.6740 - val_loss: 1.8728 - val_accuracy: 0.5660
Epoch 42/200
797/797 [==============================] - 4s 5ms/step - loss: 1.3274 - accuracy: 0.6813 - val_loss: 1.8591 - val_accuracy: 0.5672
Epoch 43/200
797/797 [==============================] - 4s 5ms/step - loss: 1.3144 - accuracy: 0.6851 - val_loss: 1.8737 - val_accuracy: 0.5720
Epoch 44/200
797/797 [==============================] - 4s 5ms/step - loss: 1.3000 - accuracy: 0.6853 - val_loss: 1.8580 - val_accuracy: 0.5767
Epoch 45/200
797/797 [==============================] - 4s 5ms/step - loss: 1.2856 - accuracy: 0.6965 - val_loss: 1.9190 - val_accuracy: 0.5612
Epoch 46/200
797/797 [==============================] - 4s 5ms/step - loss: 1.2760 - accuracy: 0.6910 - val_loss: 1.9817 - val_accuracy: 0.5477
Epoch 47/200
797/797 [==============================] - 4s 5ms/step - loss: 1.2405 - accuracy: 0.7041 - val_loss: 1.7778 - val_accuracy: 0.5933
Epoch 48/200
797/797 [==============================] - 4s 5ms/step - loss: 1.2500 - accuracy: 0.7046 - val_loss: 1.9172 - val_accuracy: 0.5590
Epoch 49/200
797/797 [==============================] - 4s 5ms/step - loss: 1.2324 - accuracy: 0.7065 - val_loss: 1.8846 - val_accuracy: 0.5844
Epoch 50/200
797/797 [==============================] - 4s 5ms/step - loss: 1.2036 - accuracy: 0.7153 - val_loss: 1.8939 - val_accuracy: 0.5705
Epoch 51/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1990 - accuracy: 0.7126 - val_loss: 1.9293 - val_accuracy: 0.5691
Epoch 52/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1925 - accuracy: 0.7170 - val_loss: 1.8853 - val_accuracy: 0.5718
Epoch 53/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1716 - accuracy: 0.7218 - val_loss: 1.8407 - val_accuracy: 0.5853
Epoch 54/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1707 - accuracy: 0.7280 - val_loss: 1.8122 - val_accuracy: 0.5875
Epoch 55/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1524 - accuracy: 0.7322 - val_loss: 1.9100 - val_accuracy: 0.5745
Epoch 56/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1459 - accuracy: 0.7349 - val_loss: 1.8024 - val_accuracy: 0.5926
Epoch 57/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1169 - accuracy: 0.7384 - val_loss: 1.8243 - val_accuracy: 0.5929
Epoch 58/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1250 - accuracy: 0.7352 - val_loss: 1.9220 - val_accuracy: 0.5769
Epoch 59/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1202 - accuracy: 0.7405 - val_loss: 1.8250 - val_accuracy: 0.5966
Epoch 60/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1074 - accuracy: 0.7445 - val_loss: 1.8786 - val_accuracy: 0.5836
Epoch 61/200
797/797 [==============================] - 4s 5ms/step - loss: 1.1052 - accuracy: 0.7394 - val_loss: 1.9423 - val_accuracy: 0.5796
Epoch 62/200
797/797 [==============================] - 4s 5ms/step - loss: 1.0846 - accuracy: 0.7484 - val_loss: 1.8840 - val_accuracy: 0.5900
Epoch 63/200
797/797 [==============================] - 4s 5ms/step - loss: 1.0662 - accuracy: 0.7561 - val_loss: 1.8075 - val_accuracy: 0.6031
Epoch 64/200
797/797 [==============================] - 4s 5ms/step - loss: 1.0489 - accuracy: 0.7605 - val_loss: 2.1800 - val_accuracy: 0.5383
Epoch 65/200
797/797 [==============================] - 4s 5ms/step - loss: 1.0487 - accuracy: 0.7595 - val_loss: 1.8556 - val_accuracy: 0.5931
Epoch 66/200
797/797 [==============================] - 4s 5ms/step - loss: 1.0278 - accuracy: 0.7642 - val_loss: 1.9812 - val_accuracy: 0.5723
Epoch 67/200
797/797 [==============================] - 4s 5ms/step - loss: 1.0415 - accuracy: 0.7625 - val_loss: 1.8383 - val_accuracy: 0.5997
Test set evaluation metrics
---------------------------
Loss:     1.757
Accuracy: 59.508%
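The parameter counts in the CNN2 summary above follow directly from the layer shapes. A quick pure-Python check (layer sizes taken from the printed summary, not from the model object itself) reproduces them:

```python
# Verify the CNN2 parameter counts reported in the model summary above.
# Formulas: Conv2D -> (k*k*c_in + 1)*c_out, BatchNorm -> 4*channels
# (gamma/beta trainable; moving mean/variance non-trainable),
# Dense -> (n_in + 1)*n_out.

def conv_params(k, c_in, c_out):
    return (k * k * c_in + 1) * c_out  # kernel weights + bias

def dense_params(n_in, n_out):
    return (n_in + 1) * n_out          # weights + bias

# (in_channels, out_channels) for the four Conv2D/BatchNorm blocks
conv_channels = [(3, 32), (32, 64), (64, 128), (128, 256)]

total = sum(conv_params(3, c_in, c_out) + 4 * c_out
            for c_in, c_out in conv_channels)
total += dense_params(4 * 4 * 256, 512)   # flatten 4x4x256 -> Dense(512)
total += dense_params(512, 60)            # Dense(60) output layer

# BN moving statistics (mean + variance per channel) are non-trainable
non_trainable = 2 * sum(c_out for _, c_out in conv_channels)

print(total, total - non_trainable, non_trainable)
# -> 2518780 2517820 960, matching Total/Trainable/Non-trainable params above
```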

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_60["VGG_ALL"] = fit_and_test_model(number_of_classes, VGG16_MODEL_OPTIMIZED, "VGG16")
Model: "sequential_7"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_16 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_4 ( (None, 512)               0         
_________________________________________________________________
dense_10 (Dense)             (None, 60)                30780     
=================================================================
Total params: 14,745,468
Trainable params: 14,745,468
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
797/797 [==============================] - 16s 19ms/step - loss: 4.0758 - accuracy: 0.0301 - val_loss: 3.3412 - val_accuracy: 0.1784
Epoch 2/200
797/797 [==============================] - 15s 18ms/step - loss: 3.1486 - accuracy: 0.2229 - val_loss: 2.1531 - val_accuracy: 0.4379
Epoch 3/200
797/797 [==============================] - 15s 18ms/step - loss: 2.0955 - accuracy: 0.4485 - val_loss: 1.6851 - val_accuracy: 0.5392
Epoch 4/200
797/797 [==============================] - 15s 18ms/step - loss: 1.5766 - accuracy: 0.5686 - val_loss: 1.4497 - val_accuracy: 0.5997
Epoch 5/200
797/797 [==============================] - 15s 18ms/step - loss: 1.2188 - accuracy: 0.6591 - val_loss: 1.3769 - val_accuracy: 0.6246
Epoch 6/200
797/797 [==============================] - 15s 18ms/step - loss: 0.9683 - accuracy: 0.7181 - val_loss: 1.3466 - val_accuracy: 0.6345
Epoch 7/200
797/797 [==============================] - 15s 18ms/step - loss: 0.7582 - accuracy: 0.7762 - val_loss: 1.3436 - val_accuracy: 0.6463
Epoch 8/200
797/797 [==============================] - 15s 18ms/step - loss: 0.5851 - accuracy: 0.8298 - val_loss: 1.3625 - val_accuracy: 0.6576
Epoch 9/200
797/797 [==============================] - 15s 18ms/step - loss: 0.4487 - accuracy: 0.8630 - val_loss: 1.4071 - val_accuracy: 0.6589
Epoch 10/200
797/797 [==============================] - 15s 18ms/step - loss: 0.3490 - accuracy: 0.8967 - val_loss: 1.5046 - val_accuracy: 0.6523
Epoch 11/200
797/797 [==============================] - 15s 18ms/step - loss: 0.2586 - accuracy: 0.9210 - val_loss: 1.6100 - val_accuracy: 0.6514
Epoch 12/200
797/797 [==============================] - 15s 18ms/step - loss: 0.2147 - accuracy: 0.9339 - val_loss: 1.6201 - val_accuracy: 0.6569
Epoch 13/200
797/797 [==============================] - 15s 18ms/step - loss: 0.1575 - accuracy: 0.9522 - val_loss: 1.7763 - val_accuracy: 0.6598
Epoch 14/200
797/797 [==============================] - 15s 18ms/step - loss: 0.1457 - accuracy: 0.9548 - val_loss: 1.6366 - val_accuracy: 0.6629
Epoch 15/200
797/797 [==============================] - 15s 18ms/step - loss: 0.1345 - accuracy: 0.9587 - val_loss: 1.9396 - val_accuracy: 0.6449
Epoch 16/200
797/797 [==============================] - 15s 18ms/step - loss: 0.1045 - accuracy: 0.9681 - val_loss: 1.8071 - val_accuracy: 0.6640
Epoch 17/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0974 - accuracy: 0.9711 - val_loss: 1.8528 - val_accuracy: 0.6545
Epoch 18/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0959 - accuracy: 0.9729 - val_loss: 1.7981 - val_accuracy: 0.6645
Epoch 19/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0813 - accuracy: 0.9766 - val_loss: 1.9052 - val_accuracy: 0.6591
Epoch 20/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0793 - accuracy: 0.9762 - val_loss: 1.8946 - val_accuracy: 0.6594
Epoch 21/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0925 - accuracy: 0.9722 - val_loss: 1.8979 - val_accuracy: 0.6576
Epoch 22/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0777 - accuracy: 0.9789 - val_loss: 1.8177 - val_accuracy: 0.6565
Epoch 23/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0578 - accuracy: 0.9824 - val_loss: 1.9443 - val_accuracy: 0.6518
Epoch 24/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0712 - accuracy: 0.9796 - val_loss: 1.9804 - val_accuracy: 0.6425
Epoch 25/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0730 - accuracy: 0.9795 - val_loss: 1.9667 - val_accuracy: 0.6662
Epoch 26/200
797/797 [==============================] - 15s 19ms/step - loss: 0.0615 - accuracy: 0.9817 - val_loss: 1.9298 - val_accuracy: 0.6700
Epoch 27/200
797/797 [==============================] - 15s 18ms/step - loss: 0.0494 - accuracy: 0.9859 - val_loss: 2.0016 - val_accuracy: 0.6594
Test set evaluation metrics
---------------------------
Loss:     1.352
Accuracy: 63.331%
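Judging from the summary above (a 1×1×512 feature map followed by Dropout, GlobalAveragePooling2D and a 60-way Dense layer, with every parameter trainable), the optimized VGG16 model is a fully fine-tuned VGG16 backbone with a small classification head. A minimal sketch of such a model follows; `init_VGG16_model_optimized` itself is defined elsewhere in the notebook, and the dropout rate and optimizer here are assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_vgg16_head(classes_num=60, dropout_rate=0.2, weights=None):
    """Fully trainable VGG16 backbone + small classification head.

    The notebook presumably loads ImageNet weights (weights="imagenet");
    weights=None here avoids the download. dropout_rate is an assumption.
    """
    base = tf.keras.applications.VGG16(include_top=False,
                                       weights=weights,
                                       input_shape=(32, 32, 3))
    base.trainable = True  # summary shows all 14,745,468 params trainable
    model = models.Sequential([
        base,                              # (None, 1, 1, 512): 32x32 input halved by 5 poolings
        layers.Dropout(dropout_rate),
        layers.GlobalAveragePooling2D(),
        layers.Dense(classes_num, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_vgg16_head()
print(model.count_params())  # -> 14745468, matching the summary above
```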
MobileNet
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_60["MOBILENET_ALL"] = fit_and_test_model(number_of_classes, MobileNetV2_MODEL_OPTIMIZED, "MobileNet")
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9412608/9406464 [==============================] - 0s 0us/step
Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_13 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1280)              0         
_________________________________________________________________
dense_7 (Dense)              (None, 60)                76860     
=================================================================
Total params: 2,334,844
Trainable params: 2,300,732
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
797/797 [==============================] - 145s 177ms/step - loss: 2.3800 - accuracy: 0.4179 - val_loss: 3.3219 - val_accuracy: 0.2398
Epoch 2/200
797/797 [==============================] - 140s 176ms/step - loss: 0.6775 - accuracy: 0.8014 - val_loss: 2.5968 - val_accuracy: 0.3770
Epoch 3/200
797/797 [==============================] - 140s 176ms/step - loss: 0.3756 - accuracy: 0.8860 - val_loss: 2.3121 - val_accuracy: 0.4260
Epoch 4/200
797/797 [==============================] - 140s 176ms/step - loss: 0.2159 - accuracy: 0.9374 - val_loss: 1.0415 - val_accuracy: 0.7272
Epoch 5/200
797/797 [==============================] - 140s 176ms/step - loss: 0.1339 - accuracy: 0.9624 - val_loss: 0.9569 - val_accuracy: 0.7493
Epoch 6/200
797/797 [==============================] - 140s 175ms/step - loss: 0.0975 - accuracy: 0.9735 - val_loss: 0.9559 - val_accuracy: 0.7637
Epoch 7/200
797/797 [==============================] - 140s 176ms/step - loss: 0.0790 - accuracy: 0.9780 - val_loss: 0.9073 - val_accuracy: 0.7841
Epoch 8/200
797/797 [==============================] - 140s 176ms/step - loss: 0.0678 - accuracy: 0.9816 - val_loss: 0.9641 - val_accuracy: 0.7695
Epoch 9/200
797/797 [==============================] - 140s 175ms/step - loss: 0.0673 - accuracy: 0.9812 - val_loss: 1.0563 - val_accuracy: 0.7498
Epoch 10/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0560 - accuracy: 0.9847 - val_loss: 1.0135 - val_accuracy: 0.7682
Epoch 11/200
797/797 [==============================] - 138s 173ms/step - loss: 0.0516 - accuracy: 0.9841 - val_loss: 1.1073 - val_accuracy: 0.7631
Epoch 12/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0435 - accuracy: 0.9877 - val_loss: 1.1528 - val_accuracy: 0.7558
Epoch 13/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0477 - accuracy: 0.9857 - val_loss: 1.1143 - val_accuracy: 0.7606
Epoch 14/200
797/797 [==============================] - 139s 174ms/step - loss: 0.0508 - accuracy: 0.9863 - val_loss: 1.1215 - val_accuracy: 0.7591
Epoch 15/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0410 - accuracy: 0.9874 - val_loss: 1.1689 - val_accuracy: 0.7635
Epoch 16/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0452 - accuracy: 0.9856 - val_loss: 1.2637 - val_accuracy: 0.7584
Epoch 17/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0415 - accuracy: 0.9869 - val_loss: 1.1907 - val_accuracy: 0.7562
Epoch 18/200
797/797 [==============================] - 140s 175ms/step - loss: 0.0323 - accuracy: 0.9902 - val_loss: 1.0428 - val_accuracy: 0.7770
Epoch 19/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0375 - accuracy: 0.9875 - val_loss: 1.1008 - val_accuracy: 0.7832
Epoch 20/200
797/797 [==============================] - 140s 175ms/step - loss: 0.0345 - accuracy: 0.9891 - val_loss: 1.2571 - val_accuracy: 0.7544
Epoch 21/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0377 - accuracy: 0.9886 - val_loss: 1.1729 - val_accuracy: 0.7629
Epoch 22/200
797/797 [==============================] - 138s 174ms/step - loss: 0.0370 - accuracy: 0.9882 - val_loss: 1.2205 - val_accuracy: 0.7595
Epoch 23/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0301 - accuracy: 0.9909 - val_loss: 1.2180 - val_accuracy: 0.7586
Epoch 24/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0329 - accuracy: 0.9890 - val_loss: 1.2179 - val_accuracy: 0.7706
Epoch 25/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0301 - accuracy: 0.9901 - val_loss: 1.0922 - val_accuracy: 0.7695
Epoch 26/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0370 - accuracy: 0.9885 - val_loss: 1.1361 - val_accuracy: 0.7646
Epoch 27/200
797/797 [==============================] - 139s 175ms/step - loss: 0.0310 - accuracy: 0.9903 - val_loss: 1.0764 - val_accuracy: 0.7835
Test set evaluation metrics
---------------------------
Loss:     0.902
Accuracy: 77.693%
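The MobileNetV2 backbone above emits a 7×7×1280 feature map, which implies the 32×32 CIFAR images are upscaled to the network's native 224×224 resolution before the forward pass; this, rather than the parameter count, is why each epoch takes about 140 s versus roughly 4 s for the from-scratch CNNs. The notebook presumably resizes inside the input pipeline (e.g. with `tf.image.resize`); a NumPy nearest-neighbour equivalent shows the shape arithmetic:

```python
import numpy as np

def upscale_nearest(img, factor=7):
    """Nearest-neighbour upscaling: each pixel becomes a factor x factor block."""
    return np.kron(img, np.ones((factor, factor, 1), dtype=img.dtype))

cifar_img = np.zeros((32, 32, 3), dtype=np.float32)  # one CIFAR-100 image
big = upscale_nearest(cifar_img)                     # 32 * 7 = 224
print(big.shape)  # -> (224, 224, 3), MobileNetV2's native input size
```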
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_60["DENSENET_ALL"] = fit_and_test_model(number_of_classes, DENSENET_MODEL_OPTIMIZED, "DenseNet")
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_14 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_2 ( (None, 1024)              0         
_________________________________________________________________
dense_8 (Dense)              (None, 60)                61500     
=================================================================
Total params: 7,099,004
Trainable params: 7,015,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
797/797 [==============================] - 37s 35ms/step - loss: 4.4054 - accuracy: 0.0824 - val_loss: 2.3670 - val_accuracy: 0.4005
Epoch 2/200
797/797 [==============================] - 27s 34ms/step - loss: 2.5309 - accuracy: 0.3522 - val_loss: 1.7094 - val_accuracy: 0.5437
Epoch 3/200
797/797 [==============================] - 27s 33ms/step - loss: 1.9136 - accuracy: 0.4922 - val_loss: 1.4777 - val_accuracy: 0.5935
Epoch 4/200
797/797 [==============================] - 27s 34ms/step - loss: 1.5527 - accuracy: 0.5765 - val_loss: 1.3949 - val_accuracy: 0.6139
Epoch 5/200
797/797 [==============================] - 27s 34ms/step - loss: 1.2681 - accuracy: 0.6436 - val_loss: 1.3256 - val_accuracy: 0.6308
Epoch 6/200
797/797 [==============================] - 27s 34ms/step - loss: 1.0669 - accuracy: 0.6928 - val_loss: 1.3479 - val_accuracy: 0.6299
Epoch 7/200
797/797 [==============================] - 27s 33ms/step - loss: 0.9260 - accuracy: 0.7288 - val_loss: 1.3172 - val_accuracy: 0.6489
Epoch 8/200
797/797 [==============================] - 27s 33ms/step - loss: 0.7560 - accuracy: 0.7739 - val_loss: 1.3289 - val_accuracy: 0.6640
Epoch 9/200
797/797 [==============================] - 27s 34ms/step - loss: 0.6134 - accuracy: 0.8180 - val_loss: 1.3823 - val_accuracy: 0.6551
Epoch 10/200
797/797 [==============================] - 27s 34ms/step - loss: 0.5521 - accuracy: 0.8319 - val_loss: 1.4393 - val_accuracy: 0.6489
Epoch 11/200
797/797 [==============================] - 27s 34ms/step - loss: 0.4496 - accuracy: 0.8622 - val_loss: 1.5891 - val_accuracy: 0.6283
Epoch 12/200
797/797 [==============================] - 27s 34ms/step - loss: 0.4139 - accuracy: 0.8718 - val_loss: 1.4388 - val_accuracy: 0.6629
Epoch 13/200
797/797 [==============================] - 27s 34ms/step - loss: 0.3289 - accuracy: 0.8982 - val_loss: 1.4653 - val_accuracy: 0.6651
Epoch 14/200
797/797 [==============================] - 27s 34ms/step - loss: 0.3109 - accuracy: 0.9061 - val_loss: 1.5304 - val_accuracy: 0.6645
Epoch 15/200
797/797 [==============================] - 27s 34ms/step - loss: 0.2655 - accuracy: 0.9165 - val_loss: 1.5606 - val_accuracy: 0.6647
Epoch 16/200
797/797 [==============================] - 27s 34ms/step - loss: 0.2437 - accuracy: 0.9248 - val_loss: 1.5456 - val_accuracy: 0.6662
Epoch 17/200
797/797 [==============================] - 27s 34ms/step - loss: 0.2087 - accuracy: 0.9359 - val_loss: 1.5849 - val_accuracy: 0.6724
Epoch 18/200
797/797 [==============================] - 27s 34ms/step - loss: 0.2001 - accuracy: 0.9378 - val_loss: 1.6980 - val_accuracy: 0.6580
Epoch 19/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1861 - accuracy: 0.9447 - val_loss: 1.6300 - val_accuracy: 0.6707
Epoch 20/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1609 - accuracy: 0.9485 - val_loss: 1.6745 - val_accuracy: 0.6649
Epoch 21/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1640 - accuracy: 0.9488 - val_loss: 1.6788 - val_accuracy: 0.6598
Epoch 22/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1530 - accuracy: 0.9545 - val_loss: 1.6862 - val_accuracy: 0.6527
Epoch 23/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1391 - accuracy: 0.9576 - val_loss: 1.6962 - val_accuracy: 0.6629
Epoch 24/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1400 - accuracy: 0.9578 - val_loss: 1.6786 - val_accuracy: 0.6693
Epoch 25/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1381 - accuracy: 0.9572 - val_loss: 1.6976 - val_accuracy: 0.6715
Epoch 26/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1161 - accuracy: 0.9639 - val_loss: 1.7402 - val_accuracy: 0.6669
Epoch 27/200
797/797 [==============================] - 27s 34ms/step - loss: 0.1250 - accuracy: 0.9627 - val_loss: 1.7890 - val_accuracy: 0.6638
Test set evaluation metrics
---------------------------
Loss:     1.300
Accuracy: 64.777%

Number of classes = 80

"From scratch" networks

In [ ]:
# Number of classes for this round of experiments
number_of_classes = 80

# test-set accuracy of each optimized model, keyed by model name
accuracies_opt_80 = {}
Simple CNN
In [ ]:
# build the optimized simple CNN for 80 classes, train it, and record its test accuracy
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_80["SIMPLE_MODEL"] = fit_and_test_model(number_of_classes, SIMPLE_MODEL_OPTIMIZED, "Simple Model")
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization (BatchNo (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu (ReLU)                 (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 32)        0         
_________________________________________________________________
dropout (Dropout)            (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_1 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_1 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_2 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_2 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten (Flatten)            (None, 1024)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 64)                65600     
_________________________________________________________________
dense_1 (Dense)              (None, 80)                5200      
=================================================================
Total params: 127,760
Trainable params: 127,440
Non-trainable params: 320
_________________________________________________________________
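The parameter counts in the summary above can be verified by hand: a Conv2D layer with a k×k kernel has k·k·c_in·c_out weights plus one bias per output channel, a Dense layer has n_in·n_out + n_out, and BatchNormalization carries 4 parameters per channel (trainable γ, β plus the two non-trainable moving statistics, which is where the 320 non-trainable parameters come from). A quick sanity check in plain Python:

```python
def conv2d_params(k, c_in, c_out):
    # k*k kernel over c_in input channels, plus one bias per output channel
    return k * k * c_in * c_out + c_out

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

def batchnorm_params(channels):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * channels

total = (conv2d_params(3, 3, 32) + batchnorm_params(32)     # 896 + 128
         + conv2d_params(3, 32, 64) + batchnorm_params(64)  # 18496 + 256
         + conv2d_params(3, 64, 64) + batchnorm_params(64)  # 36928 + 256
         + dense_params(1024, 64)                           # 65600
         + dense_params(64, 80))                            # 5200
print(total)                          # 127760, matching "Total params"
print(total - 2 * (32 + 64 + 64))    # 127440 trainable (moving stats excluded)
```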
Epoch 1/200
1063/1063 [==============================] - 6s 4ms/step - loss: 5.4452 - accuracy: 0.0256 - val_loss: 4.8009 - val_accuracy: 0.0791
Epoch 2/200
1063/1063 [==============================] - 4s 4ms/step - loss: 4.6728 - accuracy: 0.0897 - val_loss: 4.2702 - val_accuracy: 0.1316
Epoch 3/200
1063/1063 [==============================] - 4s 4ms/step - loss: 4.1606 - accuracy: 0.1410 - val_loss: 3.9134 - val_accuracy: 0.1634
Epoch 4/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.8163 - accuracy: 0.1761 - val_loss: 3.5904 - val_accuracy: 0.2138
Epoch 5/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.5789 - accuracy: 0.2073 - val_loss: 3.4400 - val_accuracy: 0.2274
Epoch 6/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.3787 - accuracy: 0.2337 - val_loss: 3.3106 - val_accuracy: 0.2455
Epoch 7/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.2088 - accuracy: 0.2586 - val_loss: 3.1754 - val_accuracy: 0.2699
Epoch 8/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.0860 - accuracy: 0.2796 - val_loss: 3.0028 - val_accuracy: 0.2937
Epoch 9/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.9696 - accuracy: 0.3020 - val_loss: 3.0347 - val_accuracy: 0.2911
Epoch 10/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.8722 - accuracy: 0.3130 - val_loss: 2.7578 - val_accuracy: 0.3388
Epoch 11/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.8200 - accuracy: 0.3218 - val_loss: 2.8274 - val_accuracy: 0.3175
Epoch 12/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.7640 - accuracy: 0.3328 - val_loss: 2.7250 - val_accuracy: 0.3403
Epoch 13/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.7035 - accuracy: 0.3424 - val_loss: 2.9069 - val_accuracy: 0.3077
Epoch 14/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.6263 - accuracy: 0.3611 - val_loss: 2.6194 - val_accuracy: 0.3599
Epoch 15/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.5778 - accuracy: 0.3701 - val_loss: 2.6723 - val_accuracy: 0.3559
Epoch 16/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.5400 - accuracy: 0.3730 - val_loss: 2.5518 - val_accuracy: 0.3748
Epoch 17/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.5065 - accuracy: 0.3812 - val_loss: 2.4648 - val_accuracy: 0.3951
Epoch 18/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.4534 - accuracy: 0.3897 - val_loss: 2.4890 - val_accuracy: 0.3851
Epoch 19/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.4223 - accuracy: 0.3927 - val_loss: 2.5633 - val_accuracy: 0.3742
Epoch 20/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.3991 - accuracy: 0.3995 - val_loss: 2.4158 - val_accuracy: 0.3983
Epoch 21/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.3619 - accuracy: 0.4085 - val_loss: 2.4005 - val_accuracy: 0.4036
Epoch 22/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.3371 - accuracy: 0.4130 - val_loss: 2.6412 - val_accuracy: 0.3639
Epoch 23/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.3098 - accuracy: 0.4186 - val_loss: 2.3884 - val_accuracy: 0.4051
Epoch 24/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2818 - accuracy: 0.4260 - val_loss: 2.3997 - val_accuracy: 0.4111
Epoch 25/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2516 - accuracy: 0.4326 - val_loss: 2.3726 - val_accuracy: 0.4174
Epoch 26/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2430 - accuracy: 0.4353 - val_loss: 2.2656 - val_accuracy: 0.4365
Epoch 27/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2189 - accuracy: 0.4363 - val_loss: 2.3338 - val_accuracy: 0.4189
Epoch 28/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2161 - accuracy: 0.4398 - val_loss: 2.3551 - val_accuracy: 0.4149
Epoch 29/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1961 - accuracy: 0.4460 - val_loss: 2.4203 - val_accuracy: 0.4101
Epoch 30/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1807 - accuracy: 0.4454 - val_loss: 2.3907 - val_accuracy: 0.4112
Epoch 31/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1549 - accuracy: 0.4524 - val_loss: 2.3609 - val_accuracy: 0.4229
Epoch 32/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1484 - accuracy: 0.4585 - val_loss: 2.2659 - val_accuracy: 0.4370
Epoch 33/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1302 - accuracy: 0.4581 - val_loss: 2.3753 - val_accuracy: 0.4072
Epoch 34/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1231 - accuracy: 0.4572 - val_loss: 2.2601 - val_accuracy: 0.4403
Epoch 35/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0984 - accuracy: 0.4635 - val_loss: 2.2438 - val_accuracy: 0.4403
Epoch 36/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0922 - accuracy: 0.4660 - val_loss: 2.2839 - val_accuracy: 0.4300
Epoch 37/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0680 - accuracy: 0.4682 - val_loss: 2.2096 - val_accuracy: 0.4551
Epoch 38/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0558 - accuracy: 0.4750 - val_loss: 2.3969 - val_accuracy: 0.4227
Epoch 39/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0530 - accuracy: 0.4734 - val_loss: 2.1731 - val_accuracy: 0.4545
Epoch 40/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0336 - accuracy: 0.4775 - val_loss: 2.3455 - val_accuracy: 0.4272
Epoch 41/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0470 - accuracy: 0.4729 - val_loss: 2.3349 - val_accuracy: 0.4260
Epoch 42/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0239 - accuracy: 0.4830 - val_loss: 2.2435 - val_accuracy: 0.4453
Epoch 43/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0072 - accuracy: 0.4845 - val_loss: 2.1951 - val_accuracy: 0.4530
Epoch 44/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0053 - accuracy: 0.4828 - val_loss: 2.2053 - val_accuracy: 0.4538
Epoch 45/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9879 - accuracy: 0.4902 - val_loss: 2.2651 - val_accuracy: 0.4417
Epoch 46/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9790 - accuracy: 0.4886 - val_loss: 2.2631 - val_accuracy: 0.4433
Epoch 47/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9755 - accuracy: 0.4924 - val_loss: 2.1848 - val_accuracy: 0.4561
Epoch 48/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9661 - accuracy: 0.4955 - val_loss: 2.1536 - val_accuracy: 0.4624
Epoch 49/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9716 - accuracy: 0.4914 - val_loss: 2.2484 - val_accuracy: 0.4422
Epoch 50/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9782 - accuracy: 0.4908 - val_loss: 2.2383 - val_accuracy: 0.4503
Epoch 51/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9343 - accuracy: 0.5000 - val_loss: 2.1519 - val_accuracy: 0.4752
Epoch 52/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9355 - accuracy: 0.5004 - val_loss: 2.3908 - val_accuracy: 0.4219
Epoch 53/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9290 - accuracy: 0.5028 - val_loss: 2.3028 - val_accuracy: 0.4357
Epoch 54/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9304 - accuracy: 0.5044 - val_loss: 2.1296 - val_accuracy: 0.4769
Epoch 55/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9222 - accuracy: 0.5018 - val_loss: 2.1284 - val_accuracy: 0.4727
Epoch 56/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9016 - accuracy: 0.5127 - val_loss: 2.0800 - val_accuracy: 0.4759
Epoch 57/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9126 - accuracy: 0.5024 - val_loss: 2.2310 - val_accuracy: 0.4483
Epoch 58/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9044 - accuracy: 0.5058 - val_loss: 2.1812 - val_accuracy: 0.4618
Epoch 59/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8884 - accuracy: 0.5130 - val_loss: 2.3386 - val_accuracy: 0.4307
Epoch 60/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8838 - accuracy: 0.5122 - val_loss: 2.1275 - val_accuracy: 0.4772
Epoch 61/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8760 - accuracy: 0.5168 - val_loss: 2.2137 - val_accuracy: 0.4523
Epoch 62/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8854 - accuracy: 0.5143 - val_loss: 2.2154 - val_accuracy: 0.4584
Epoch 63/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8647 - accuracy: 0.5175 - val_loss: 2.0921 - val_accuracy: 0.4757
Epoch 64/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8554 - accuracy: 0.5246 - val_loss: 2.1544 - val_accuracy: 0.4699
Epoch 65/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8864 - accuracy: 0.5108 - val_loss: 2.0774 - val_accuracy: 0.4852
Epoch 66/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8761 - accuracy: 0.5160 - val_loss: 2.1702 - val_accuracy: 0.4707
Epoch 67/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8492 - accuracy: 0.5198 - val_loss: 2.0711 - val_accuracy: 0.4857
Epoch 68/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8395 - accuracy: 0.5234 - val_loss: 2.1313 - val_accuracy: 0.4734
Epoch 69/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8505 - accuracy: 0.5248 - val_loss: 2.1125 - val_accuracy: 0.4711
Epoch 70/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8402 - accuracy: 0.5195 - val_loss: 2.0895 - val_accuracy: 0.4791
Epoch 71/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8265 - accuracy: 0.5282 - val_loss: 2.0813 - val_accuracy: 0.4797
Epoch 72/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8216 - accuracy: 0.5272 - val_loss: 2.1516 - val_accuracy: 0.4716
Epoch 73/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8269 - accuracy: 0.5228 - val_loss: 2.0719 - val_accuracy: 0.4811
Epoch 74/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8092 - accuracy: 0.5276 - val_loss: 2.0885 - val_accuracy: 0.4804
Epoch 75/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8076 - accuracy: 0.5341 - val_loss: 2.1331 - val_accuracy: 0.4742
Epoch 76/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8150 - accuracy: 0.5303 - val_loss: 2.0896 - val_accuracy: 0.4865
Epoch 77/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7931 - accuracy: 0.5322 - val_loss: 2.1501 - val_accuracy: 0.4726
Epoch 78/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7962 - accuracy: 0.5346 - val_loss: 2.0151 - val_accuracy: 0.4965
Epoch 79/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8021 - accuracy: 0.5305 - val_loss: 2.0216 - val_accuracy: 0.4992
Epoch 80/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7898 - accuracy: 0.5354 - val_loss: 2.0712 - val_accuracy: 0.4879
Epoch 81/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7912 - accuracy: 0.5346 - val_loss: 2.2136 - val_accuracy: 0.4556
Epoch 82/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7968 - accuracy: 0.5341 - val_loss: 2.0478 - val_accuracy: 0.4967
Epoch 83/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7789 - accuracy: 0.5370 - val_loss: 2.1227 - val_accuracy: 0.4797
Epoch 84/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7762 - accuracy: 0.5354 - val_loss: 2.0219 - val_accuracy: 0.4962
Epoch 85/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7710 - accuracy: 0.5384 - val_loss: 2.2951 - val_accuracy: 0.4420
Epoch 86/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7718 - accuracy: 0.5402 - val_loss: 2.1736 - val_accuracy: 0.4691
Epoch 87/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7598 - accuracy: 0.5408 - val_loss: 2.1534 - val_accuracy: 0.4676
Epoch 88/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7633 - accuracy: 0.5398 - val_loss: 2.1266 - val_accuracy: 0.4769
Epoch 89/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7553 - accuracy: 0.5403 - val_loss: 2.2206 - val_accuracy: 0.4541
Epoch 90/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7703 - accuracy: 0.5429 - val_loss: 2.0680 - val_accuracy: 0.4822
Epoch 91/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7396 - accuracy: 0.5511 - val_loss: 2.0279 - val_accuracy: 0.4942
Epoch 92/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7620 - accuracy: 0.5405 - val_loss: 2.1536 - val_accuracy: 0.4669
Epoch 93/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7585 - accuracy: 0.5383 - val_loss: 2.0953 - val_accuracy: 0.4830
Epoch 94/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7338 - accuracy: 0.5494 - val_loss: 2.0593 - val_accuracy: 0.4902
Epoch 95/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7550 - accuracy: 0.5445 - val_loss: 2.1333 - val_accuracy: 0.4756
Epoch 96/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7542 - accuracy: 0.5458 - val_loss: 2.1007 - val_accuracy: 0.4849
Epoch 97/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7433 - accuracy: 0.5466 - val_loss: 2.1158 - val_accuracy: 0.4752
Epoch 98/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7430 - accuracy: 0.5456 - val_loss: 2.2162 - val_accuracy: 0.4569
Test set evaluation metrics
---------------------------
Loss:     1.967
Accuracy: 50.037%
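Both runs above halt well before the 200 scheduled epochs (at epoch 27 and epoch 98 respectively), which is consistent with an EarlyStopping callback monitoring val_loss: in each log, training ends 20 epochs after the best validation loss, suggesting a patience of 20. This is an inference from the logs, since the callback is configured inside fit_and_test_model earlier in the notebook; the mechanism itself can be sketched in plain Python:

```python
def early_stopping_epoch(val_losses, patience=20):
    """Return the (1-based) epoch at which training halts: the first epoch
    after which val_loss has not improved for `patience` consecutive epochs."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)  # ran out of epochs without triggering

# toy curve: val_loss improves for 7 epochs, then plateaus
curve = [2.0 - 0.1 * i for i in range(7)] + [1.5] * 30
print(early_stopping_epoch(curve))  # 27 = best epoch (7) + patience (20)
```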
CNN1
In [ ]:
# build the optimized CNN1 model for 80 classes, train it, and record its test accuracy
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_80["CNN1"] = fit_and_test_model(number_of_classes, CNN1_MODEL_OPTIMIZED, "Cnn1")
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_4 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dropout_6 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 80)                82000     
=================================================================
Total params: 701,456
Trainable params: 701,008
Non-trainable params: 448
_________________________________________________________________
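The spatial dimensions in the summary above follow directly from "valid" 3×3 convolutions (output size n − k + 1 at stride 1) and 2×2 pooling (⌊n/2⌋). Tracing the 32×32 input through CNN1 down to the Flatten layer:

```python
def conv_valid(n, k=3):
    # 'valid' convolution, stride 1: output size n - k + 1
    return n - k + 1

def pool2(n):
    # 2x2 max/average pooling, stride 2: output size floor(n / 2)
    return n // 2

n = 32
n = conv_valid(n)   # 30  (conv2d_3)
n = pool2(n)        # 15  (max_pooling2d_2)
n = conv_valid(n)   # 13  (conv2d_4)
n = pool2(n)        # 6   (max_pooling2d_3)
n = conv_valid(n)   # 4   (conv2d_5)
n = pool2(n)        # 2   (average_pooling2d)
print(n * n * 128)  # 512 units into Flatten, matching the summary
```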
Epoch 1/200
1063/1063 [==============================] - 5s 4ms/step - loss: 5.3275 - accuracy: 0.0617 - val_loss: 4.4964 - val_accuracy: 0.1393
Epoch 2/200
1063/1063 [==============================] - 4s 4ms/step - loss: 4.2613 - accuracy: 0.1659 - val_loss: 3.9365 - val_accuracy: 0.2021
Epoch 3/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.7838 - accuracy: 0.2082 - val_loss: 3.6008 - val_accuracy: 0.2329
Epoch 4/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.4419 - accuracy: 0.2480 - val_loss: 3.3815 - val_accuracy: 0.2525
Epoch 5/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.2182 - accuracy: 0.2810 - val_loss: 3.0892 - val_accuracy: 0.3014
Epoch 6/200
1063/1063 [==============================] - 4s 4ms/step - loss: 3.0432 - accuracy: 0.3028 - val_loss: 3.0812 - val_accuracy: 0.2990
Epoch 7/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.8883 - accuracy: 0.3264 - val_loss: 3.0975 - val_accuracy: 0.2962
Epoch 8/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.7931 - accuracy: 0.3426 - val_loss: 2.9370 - val_accuracy: 0.3153
Epoch 9/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.7142 - accuracy: 0.3578 - val_loss: 2.7603 - val_accuracy: 0.3461
Epoch 10/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.6080 - accuracy: 0.3699 - val_loss: 2.5130 - val_accuracy: 0.4082
Epoch 11/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.5532 - accuracy: 0.3812 - val_loss: 2.4154 - val_accuracy: 0.4186
Epoch 12/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.4795 - accuracy: 0.3937 - val_loss: 2.4547 - val_accuracy: 0.4003
Epoch 13/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.4339 - accuracy: 0.3983 - val_loss: 2.4305 - val_accuracy: 0.4069
Epoch 14/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.3864 - accuracy: 0.4097 - val_loss: 2.6050 - val_accuracy: 0.3767
Epoch 15/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.3458 - accuracy: 0.4178 - val_loss: 2.4319 - val_accuracy: 0.4094
Epoch 16/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2949 - accuracy: 0.4240 - val_loss: 2.3492 - val_accuracy: 0.4289
Epoch 17/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2655 - accuracy: 0.4340 - val_loss: 2.3156 - val_accuracy: 0.4325
Epoch 18/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2347 - accuracy: 0.4409 - val_loss: 2.3414 - val_accuracy: 0.4259
Epoch 19/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.2095 - accuracy: 0.4453 - val_loss: 2.1890 - val_accuracy: 0.4636
Epoch 20/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1715 - accuracy: 0.4543 - val_loss: 2.3104 - val_accuracy: 0.4355
Epoch 21/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1208 - accuracy: 0.4610 - val_loss: 2.2464 - val_accuracy: 0.4465
Epoch 22/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1145 - accuracy: 0.4621 - val_loss: 2.2221 - val_accuracy: 0.4533
Epoch 23/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.1021 - accuracy: 0.4656 - val_loss: 2.1721 - val_accuracy: 0.4613
Epoch 24/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0663 - accuracy: 0.4734 - val_loss: 2.2121 - val_accuracy: 0.4551
Epoch 25/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0464 - accuracy: 0.4779 - val_loss: 2.2380 - val_accuracy: 0.4493
Epoch 26/200
1063/1063 [==============================] - 4s 4ms/step - loss: 2.0232 - accuracy: 0.4843 - val_loss: 2.1321 - val_accuracy: 0.4752
Epoch 27/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9901 - accuracy: 0.4988 - val_loss: 2.2170 - val_accuracy: 0.4555
Epoch 28/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9808 - accuracy: 0.4876 - val_loss: 2.1113 - val_accuracy: 0.4761
Epoch 29/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9530 - accuracy: 0.4976 - val_loss: 2.1822 - val_accuracy: 0.4624
Epoch 30/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9464 - accuracy: 0.4986 - val_loss: 2.0039 - val_accuracy: 0.4968
Epoch 31/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.9349 - accuracy: 0.5019 - val_loss: 2.0654 - val_accuracy: 0.4879
Epoch 32/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8947 - accuracy: 0.5145 - val_loss: 2.0080 - val_accuracy: 0.4992
Epoch 33/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8981 - accuracy: 0.5105 - val_loss: 2.0781 - val_accuracy: 0.4842
Epoch 34/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8837 - accuracy: 0.5161 - val_loss: 2.0464 - val_accuracy: 0.4947
Epoch 35/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8519 - accuracy: 0.5225 - val_loss: 2.2380 - val_accuracy: 0.4569
Epoch 36/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8643 - accuracy: 0.5206 - val_loss: 2.1068 - val_accuracy: 0.4842
Epoch 37/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8351 - accuracy: 0.5277 - val_loss: 2.0535 - val_accuracy: 0.4875
Epoch 38/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8164 - accuracy: 0.5310 - val_loss: 2.2434 - val_accuracy: 0.4535
Epoch 39/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.8107 - accuracy: 0.5308 - val_loss: 2.0364 - val_accuracy: 0.4980
Epoch 40/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7784 - accuracy: 0.5381 - val_loss: 2.0288 - val_accuracy: 0.4972
Epoch 41/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7873 - accuracy: 0.5358 - val_loss: 2.1307 - val_accuracy: 0.4834
Epoch 42/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7584 - accuracy: 0.5414 - val_loss: 1.9875 - val_accuracy: 0.5068
Epoch 43/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7641 - accuracy: 0.5393 - val_loss: 2.1378 - val_accuracy: 0.4794
Epoch 44/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7434 - accuracy: 0.5510 - val_loss: 1.9775 - val_accuracy: 0.5078
Epoch 45/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7228 - accuracy: 0.5501 - val_loss: 1.9605 - val_accuracy: 0.5105
Epoch 46/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7163 - accuracy: 0.5512 - val_loss: 2.0208 - val_accuracy: 0.4963
Epoch 47/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7059 - accuracy: 0.5591 - val_loss: 1.9753 - val_accuracy: 0.5178
Epoch 48/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.7026 - accuracy: 0.5566 - val_loss: 2.0518 - val_accuracy: 0.4938
Epoch 49/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6784 - accuracy: 0.5642 - val_loss: 1.9586 - val_accuracy: 0.5141
Epoch 50/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6878 - accuracy: 0.5564 - val_loss: 2.0818 - val_accuracy: 0.4952
Epoch 51/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6645 - accuracy: 0.5654 - val_loss: 1.9528 - val_accuracy: 0.5191
Epoch 52/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6481 - accuracy: 0.5681 - val_loss: 1.9985 - val_accuracy: 0.5095
Epoch 53/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6399 - accuracy: 0.5692 - val_loss: 1.9753 - val_accuracy: 0.5173
Epoch 54/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6365 - accuracy: 0.5728 - val_loss: 1.9138 - val_accuracy: 0.5283
Epoch 55/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6351 - accuracy: 0.5712 - val_loss: 2.0528 - val_accuracy: 0.5030
Epoch 56/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6149 - accuracy: 0.5790 - val_loss: 1.9367 - val_accuracy: 0.5226
Epoch 57/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.6070 - accuracy: 0.5780 - val_loss: 1.9670 - val_accuracy: 0.5165
Epoch 58/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5946 - accuracy: 0.5883 - val_loss: 1.8845 - val_accuracy: 0.5319
Epoch 59/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5905 - accuracy: 0.5867 - val_loss: 2.0533 - val_accuracy: 0.5062
Epoch 60/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5798 - accuracy: 0.5843 - val_loss: 1.9789 - val_accuracy: 0.5121
Epoch 61/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5738 - accuracy: 0.5902 - val_loss: 2.0118 - val_accuracy: 0.5098
Epoch 62/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5729 - accuracy: 0.5900 - val_loss: 2.1073 - val_accuracy: 0.4938
Epoch 63/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5608 - accuracy: 0.5880 - val_loss: 1.9294 - val_accuracy: 0.5256
Epoch 64/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5545 - accuracy: 0.5874 - val_loss: 2.0106 - val_accuracy: 0.5093
Epoch 65/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5484 - accuracy: 0.5954 - val_loss: 1.9830 - val_accuracy: 0.5204
Epoch 66/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5556 - accuracy: 0.5946 - val_loss: 1.9697 - val_accuracy: 0.5214
Epoch 67/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5341 - accuracy: 0.5946 - val_loss: 1.9900 - val_accuracy: 0.5170
Epoch 68/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5209 - accuracy: 0.5995 - val_loss: 1.9804 - val_accuracy: 0.5141
Epoch 69/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5307 - accuracy: 0.5998 - val_loss: 1.9373 - val_accuracy: 0.5218
Epoch 70/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.5171 - accuracy: 0.6050 - val_loss: 1.9628 - val_accuracy: 0.5228
Epoch 71/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4925 - accuracy: 0.6097 - val_loss: 2.0834 - val_accuracy: 0.4997
Epoch 72/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4925 - accuracy: 0.6077 - val_loss: 1.9312 - val_accuracy: 0.5316
Epoch 73/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4859 - accuracy: 0.6114 - val_loss: 1.9756 - val_accuracy: 0.5178
Epoch 74/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4688 - accuracy: 0.6195 - val_loss: 2.1012 - val_accuracy: 0.4912
Epoch 75/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4857 - accuracy: 0.6086 - val_loss: 1.9993 - val_accuracy: 0.5133
Epoch 76/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4721 - accuracy: 0.6104 - val_loss: 1.9719 - val_accuracy: 0.5239
Epoch 77/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4599 - accuracy: 0.6163 - val_loss: 1.8904 - val_accuracy: 0.5331
Epoch 78/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4571 - accuracy: 0.6201 - val_loss: 1.8581 - val_accuracy: 0.5512
Epoch 79/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4488 - accuracy: 0.6216 - val_loss: 2.2220 - val_accuracy: 0.4746
Epoch 80/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4533 - accuracy: 0.6168 - val_loss: 1.8455 - val_accuracy: 0.5452
Epoch 81/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4449 - accuracy: 0.6181 - val_loss: 1.9582 - val_accuracy: 0.5288
Epoch 82/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4404 - accuracy: 0.6193 - val_loss: 1.9322 - val_accuracy: 0.5329
Epoch 83/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4171 - accuracy: 0.6290 - val_loss: 1.9374 - val_accuracy: 0.5294
Epoch 84/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4250 - accuracy: 0.6231 - val_loss: 1.9251 - val_accuracy: 0.5293
Epoch 85/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4176 - accuracy: 0.6296 - val_loss: 1.9554 - val_accuracy: 0.5224
Epoch 86/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4224 - accuracy: 0.6254 - val_loss: 1.9849 - val_accuracy: 0.5166
Epoch 87/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3974 - accuracy: 0.6324 - val_loss: 1.9318 - val_accuracy: 0.5281
Epoch 88/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.4015 - accuracy: 0.6331 - val_loss: 1.9279 - val_accuracy: 0.5347
Epoch 89/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3827 - accuracy: 0.6372 - val_loss: 1.9050 - val_accuracy: 0.5359
Epoch 90/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3908 - accuracy: 0.6363 - val_loss: 1.8996 - val_accuracy: 0.5396
Epoch 91/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3677 - accuracy: 0.6396 - val_loss: 1.9404 - val_accuracy: 0.5329
Epoch 92/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3736 - accuracy: 0.6344 - val_loss: 1.9645 - val_accuracy: 0.5284
Epoch 93/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3775 - accuracy: 0.6387 - val_loss: 1.9299 - val_accuracy: 0.5359
Epoch 94/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3660 - accuracy: 0.6432 - val_loss: 1.9767 - val_accuracy: 0.5293
Epoch 95/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3609 - accuracy: 0.6402 - val_loss: 1.8801 - val_accuracy: 0.5414
Epoch 96/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3805 - accuracy: 0.6423 - val_loss: 1.8940 - val_accuracy: 0.5477
Epoch 97/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3447 - accuracy: 0.6503 - val_loss: 1.9956 - val_accuracy: 0.5326
Epoch 98/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3554 - accuracy: 0.6438 - val_loss: 1.9021 - val_accuracy: 0.5382
Epoch 99/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3456 - accuracy: 0.6480 - val_loss: 1.8677 - val_accuracy: 0.5426
Epoch 100/200
1063/1063 [==============================] - 4s 4ms/step - loss: 1.3510 - accuracy: 0.6423 - val_loss: 1.8918 - val_accuracy: 0.5482
Test set evaluation metrics
---------------------------
Loss:     1.797
Accuracy: 55.275%
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True, classes_num = number_of_classes)
accuracies_opt_80["CNN2"] = fit_and_test_model(number_of_classes, CNN2_MODEL_OPTIMIZED, "Cnn2")
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_7 (Dropout)          (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_8 (Dropout)          (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_9 (Dropout)          (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_9 (Batch (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_9 (ReLU)               (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_10 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dropout_11 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 80)                41040     
=================================================================
Total params: 2,529,040
Trainable params: 2,528,080
Non-trainable params: 960
_________________________________________________________________
Epoch 1/200
1063/1063 [==============================] - 6s 5ms/step - loss: 7.0499 - accuracy: 0.0511 - val_loss: 5.8952 - val_accuracy: 0.1054
Epoch 2/200
1063/1063 [==============================] - 5s 5ms/step - loss: 5.5744 - accuracy: 0.1356 - val_loss: 4.8647 - val_accuracy: 0.1774
Epoch 3/200
1063/1063 [==============================] - 5s 5ms/step - loss: 4.7097 - accuracy: 0.1862 - val_loss: 4.4032 - val_accuracy: 0.1917
Epoch 4/200
1063/1063 [==============================] - 5s 5ms/step - loss: 4.1179 - accuracy: 0.2264 - val_loss: 3.9791 - val_accuracy: 0.2291
Epoch 5/200
1063/1063 [==============================] - 5s 5ms/step - loss: 3.7358 - accuracy: 0.2502 - val_loss: 3.6228 - val_accuracy: 0.2527
Epoch 6/200
1063/1063 [==============================] - 5s 5ms/step - loss: 3.4183 - accuracy: 0.2866 - val_loss: 3.2200 - val_accuracy: 0.3185
Epoch 7/200
1063/1063 [==============================] - 5s 5ms/step - loss: 3.1998 - accuracy: 0.3103 - val_loss: 3.1993 - val_accuracy: 0.3097
Epoch 8/200
1063/1063 [==============================] - 5s 5ms/step - loss: 3.0316 - accuracy: 0.3280 - val_loss: 3.2072 - val_accuracy: 0.2955
Epoch 9/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.8776 - accuracy: 0.3465 - val_loss: 2.9889 - val_accuracy: 0.3334
Epoch 10/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.7607 - accuracy: 0.3651 - val_loss: 2.6975 - val_accuracy: 0.3901
Epoch 11/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.6495 - accuracy: 0.3842 - val_loss: 2.6499 - val_accuracy: 0.3941
Epoch 12/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.5632 - accuracy: 0.4045 - val_loss: 2.6700 - val_accuracy: 0.3873
Epoch 13/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.4784 - accuracy: 0.4119 - val_loss: 2.6106 - val_accuracy: 0.4033
Epoch 14/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.3983 - accuracy: 0.4301 - val_loss: 2.4400 - val_accuracy: 0.4330
Epoch 15/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.3420 - accuracy: 0.4415 - val_loss: 2.4136 - val_accuracy: 0.4392
Epoch 16/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.2885 - accuracy: 0.4486 - val_loss: 2.3568 - val_accuracy: 0.4515
Epoch 17/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.2169 - accuracy: 0.4642 - val_loss: 2.2449 - val_accuracy: 0.4638
Epoch 18/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.1785 - accuracy: 0.4748 - val_loss: 2.3747 - val_accuracy: 0.4353
Epoch 19/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.1349 - accuracy: 0.4858 - val_loss: 2.4406 - val_accuracy: 0.4272
Epoch 20/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.1060 - accuracy: 0.4906 - val_loss: 2.2579 - val_accuracy: 0.4644
Epoch 21/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.0546 - accuracy: 0.4988 - val_loss: 2.2432 - val_accuracy: 0.4744
Epoch 22/200
1063/1063 [==============================] - 5s 5ms/step - loss: 2.0414 - accuracy: 0.5020 - val_loss: 2.3130 - val_accuracy: 0.4611
Epoch 23/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.9804 - accuracy: 0.5199 - val_loss: 2.1986 - val_accuracy: 0.4811
Epoch 24/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.9646 - accuracy: 0.5161 - val_loss: 2.1330 - val_accuracy: 0.4975
Epoch 25/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.9331 - accuracy: 0.5272 - val_loss: 2.1721 - val_accuracy: 0.4859
Epoch 26/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.9064 - accuracy: 0.5355 - val_loss: 2.2289 - val_accuracy: 0.4797
Epoch 27/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.8631 - accuracy: 0.5457 - val_loss: 2.3095 - val_accuracy: 0.4724
Epoch 28/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.8331 - accuracy: 0.5535 - val_loss: 2.2039 - val_accuracy: 0.4814
Epoch 29/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.8273 - accuracy: 0.5518 - val_loss: 2.1578 - val_accuracy: 0.4907
Epoch 30/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.8032 - accuracy: 0.5633 - val_loss: 2.0728 - val_accuracy: 0.5136
Epoch 31/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.7824 - accuracy: 0.5639 - val_loss: 2.1576 - val_accuracy: 0.4977
Epoch 32/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.7500 - accuracy: 0.5731 - val_loss: 2.0760 - val_accuracy: 0.5115
Epoch 33/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.7279 - accuracy: 0.5796 - val_loss: 2.1051 - val_accuracy: 0.5066
Epoch 34/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.7091 - accuracy: 0.5831 - val_loss: 2.1033 - val_accuracy: 0.5105
Epoch 35/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.6710 - accuracy: 0.5917 - val_loss: 1.9878 - val_accuracy: 0.5384
Epoch 36/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.6828 - accuracy: 0.5922 - val_loss: 2.2230 - val_accuracy: 0.4900
Epoch 37/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.6474 - accuracy: 0.6027 - val_loss: 2.0283 - val_accuracy: 0.5259
Epoch 38/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.6491 - accuracy: 0.6000 - val_loss: 2.0484 - val_accuracy: 0.5284
Epoch 39/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.6268 - accuracy: 0.6040 - val_loss: 2.0132 - val_accuracy: 0.5337
Epoch 40/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.5962 - accuracy: 0.6142 - val_loss: 2.1542 - val_accuracy: 0.5038
Epoch 41/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.6027 - accuracy: 0.6135 - val_loss: 2.1214 - val_accuracy: 0.5130
Epoch 42/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.5694 - accuracy: 0.6215 - val_loss: 2.1361 - val_accuracy: 0.5103
Epoch 43/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.5442 - accuracy: 0.6260 - val_loss: 2.0261 - val_accuracy: 0.5279
Epoch 44/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.5355 - accuracy: 0.6288 - val_loss: 2.1353 - val_accuracy: 0.5080
Epoch 45/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.5269 - accuracy: 0.6341 - val_loss: 2.1280 - val_accuracy: 0.5121
Epoch 46/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4969 - accuracy: 0.6400 - val_loss: 1.9945 - val_accuracy: 0.5376
Epoch 47/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4938 - accuracy: 0.6424 - val_loss: 2.0799 - val_accuracy: 0.5268
Epoch 48/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4736 - accuracy: 0.6489 - val_loss: 2.0332 - val_accuracy: 0.5311
Epoch 49/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4664 - accuracy: 0.6490 - val_loss: 1.9854 - val_accuracy: 0.5437
Epoch 50/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4505 - accuracy: 0.6562 - val_loss: 2.0427 - val_accuracy: 0.5361
Epoch 51/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4442 - accuracy: 0.6532 - val_loss: 2.0199 - val_accuracy: 0.5426
Epoch 52/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4334 - accuracy: 0.6589 - val_loss: 2.0874 - val_accuracy: 0.5261
Epoch 53/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.4011 - accuracy: 0.6654 - val_loss: 2.2165 - val_accuracy: 0.5043
Epoch 54/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3981 - accuracy: 0.6662 - val_loss: 2.0349 - val_accuracy: 0.5449
Epoch 55/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3941 - accuracy: 0.6740 - val_loss: 2.0263 - val_accuracy: 0.5442
Epoch 56/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3717 - accuracy: 0.6763 - val_loss: 2.0928 - val_accuracy: 0.5278
Epoch 57/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3713 - accuracy: 0.6803 - val_loss: 2.0182 - val_accuracy: 0.5402
Epoch 58/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3808 - accuracy: 0.6749 - val_loss: 2.0698 - val_accuracy: 0.5442
Epoch 59/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3580 - accuracy: 0.6803 - val_loss: 2.1535 - val_accuracy: 0.5231
Epoch 60/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3550 - accuracy: 0.6855 - val_loss: 2.1303 - val_accuracy: 0.5317
Epoch 61/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3280 - accuracy: 0.6909 - val_loss: 2.1132 - val_accuracy: 0.5278
Epoch 62/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3155 - accuracy: 0.6922 - val_loss: 2.0455 - val_accuracy: 0.5421
Epoch 63/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3070 - accuracy: 0.6959 - val_loss: 2.0471 - val_accuracy: 0.5449
Epoch 64/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.3050 - accuracy: 0.6976 - val_loss: 1.9662 - val_accuracy: 0.5582
Epoch 65/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2961 - accuracy: 0.6978 - val_loss: 2.0731 - val_accuracy: 0.5427
Epoch 66/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2895 - accuracy: 0.7049 - val_loss: 2.0740 - val_accuracy: 0.5500
Epoch 67/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2743 - accuracy: 0.7052 - val_loss: 2.0373 - val_accuracy: 0.5434
Epoch 68/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2614 - accuracy: 0.7119 - val_loss: 2.1281 - val_accuracy: 0.5369
Epoch 69/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2626 - accuracy: 0.7058 - val_loss: 2.0516 - val_accuracy: 0.5534
Epoch 70/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2619 - accuracy: 0.7065 - val_loss: 2.0435 - val_accuracy: 0.5517
Epoch 71/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2317 - accuracy: 0.7177 - val_loss: 2.0374 - val_accuracy: 0.5457
Epoch 72/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2290 - accuracy: 0.7200 - val_loss: 2.0799 - val_accuracy: 0.5431
Epoch 73/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2127 - accuracy: 0.7231 - val_loss: 2.0895 - val_accuracy: 0.5504
Epoch 74/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2122 - accuracy: 0.7232 - val_loss: 2.2980 - val_accuracy: 0.5130
Epoch 75/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2209 - accuracy: 0.7223 - val_loss: 2.0947 - val_accuracy: 0.5452
Epoch 76/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1962 - accuracy: 0.7310 - val_loss: 2.0606 - val_accuracy: 0.5514
Epoch 77/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.2063 - accuracy: 0.7285 - val_loss: 2.0468 - val_accuracy: 0.5547
Epoch 78/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1868 - accuracy: 0.7337 - val_loss: 2.1367 - val_accuracy: 0.5366
Epoch 79/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1782 - accuracy: 0.7350 - val_loss: 2.0739 - val_accuracy: 0.5465
Epoch 80/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1712 - accuracy: 0.7381 - val_loss: 2.1174 - val_accuracy: 0.5442
Epoch 81/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1705 - accuracy: 0.7336 - val_loss: 2.0633 - val_accuracy: 0.5497
Epoch 82/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1668 - accuracy: 0.7400 - val_loss: 2.1427 - val_accuracy: 0.5421
Epoch 83/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1626 - accuracy: 0.7446 - val_loss: 2.1432 - val_accuracy: 0.5440
Epoch 84/200
1063/1063 [==============================] - 5s 5ms/step - loss: 1.1574 - accuracy: 0.7425 - val_loss: 2.0728 - val_accuracy: 0.5560
Test set evaluation metrics
---------------------------
Loss:     1.924
Accuracy: 56.925%
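
As a sanity check (not part of the original notebook), the parameter counts in the CNN2 summary above can be reproduced by hand from the standard Keras formulas: a Conv2D layer with a k×k kernel has (k·k·C_in + 1)·C_out parameters, a Dense layer (n_in + 1)·n_out, and BatchNormalization 4·C, of which the two moving statistics per channel are non-trainable.

```python
# Recompute the CNN2 parameter counts shown in the model summary above.
def conv2d_params(k, c_in, c_out):
    # k x k kernel per filter, plus one bias per filter
    return (k * k * c_in + 1) * c_out

def dense_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return (n_in + 1) * n_out

def batchnorm_params(c):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * c

convs = [conv2d_params(3, 3, 32),     # conv2d_6:  896
         conv2d_params(3, 32, 64),    # conv2d_7:  18,496
         conv2d_params(3, 64, 128),   # conv2d_8:  73,856
         conv2d_params(3, 128, 256)]  # conv2d_9:  295,168
bns = [batchnorm_params(c) for c in (32, 64, 128, 256)]
denses = [dense_params(4 * 4 * 256, 512),  # dense_4: 2,097,664
          dense_params(512, 80)]           # dense_5: 41,040

total = sum(convs) + sum(bns) + sum(denses)
non_trainable = sum(bns) // 2  # the moving statistics of each BatchNorm layer

print(total, non_trainable)  # 2529040 960, matching the summary
```

The recomputed totals match the summary exactly: 2,529,040 parameters, of which 960 (the BatchNorm moving statistics) are non-trainable.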

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_80["VGG_ALL"] = fit_and_test_model(number_of_classes, VGG16_MODEL_OPTIMIZED, "VGG16")
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58892288/58889256 [==============================] - 0s 0us/step
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout (Dropout)            (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 512)               0         
_________________________________________________________________
dense (Dense)                (None, 80)                41040     
=================================================================
Total params: 14,755,728
Trainable params: 14,755,728
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
1063/1063 [==============================] - 39s 30ms/step - loss: 4.2591 - accuracy: 0.0446 - val_loss: 3.0785 - val_accuracy: 0.2601
Epoch 2/200
1063/1063 [==============================] - 32s 30ms/step - loss: 2.9030 - accuracy: 0.2870 - val_loss: 2.0717 - val_accuracy: 0.4581
Epoch 3/200
1063/1063 [==============================] - 33s 31ms/step - loss: 2.0730 - accuracy: 0.4579 - val_loss: 1.7420 - val_accuracy: 0.5253
Epoch 4/200
1063/1063 [==============================] - 32s 30ms/step - loss: 1.6464 - accuracy: 0.5532 - val_loss: 1.6244 - val_accuracy: 0.5600
Epoch 5/200
1063/1063 [==============================] - 32s 30ms/step - loss: 1.3495 - accuracy: 0.6257 - val_loss: 1.6503 - val_accuracy: 0.5688
Epoch 6/200
1063/1063 [==============================] - 32s 30ms/step - loss: 1.1027 - accuracy: 0.6885 - val_loss: 1.5687 - val_accuracy: 0.5788
Epoch 7/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.8880 - accuracy: 0.7392 - val_loss: 1.5819 - val_accuracy: 0.5942
Epoch 8/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.6939 - accuracy: 0.7963 - val_loss: 1.6367 - val_accuracy: 0.5921
Epoch 9/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.5319 - accuracy: 0.8393 - val_loss: 1.6757 - val_accuracy: 0.5916
Epoch 10/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.4087 - accuracy: 0.8768 - val_loss: 1.8399 - val_accuracy: 0.5951
Epoch 11/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.3267 - accuracy: 0.9013 - val_loss: 1.8996 - val_accuracy: 0.6027
Epoch 12/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.2516 - accuracy: 0.9224 - val_loss: 1.8591 - val_accuracy: 0.6129
Epoch 13/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.2061 - accuracy: 0.9355 - val_loss: 1.9820 - val_accuracy: 0.6097
Epoch 14/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.1668 - accuracy: 0.9479 - val_loss: 2.1610 - val_accuracy: 0.5944
Epoch 15/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.1445 - accuracy: 0.9562 - val_loss: 2.1108 - val_accuracy: 0.6107
Epoch 16/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.1289 - accuracy: 0.9624 - val_loss: 2.2570 - val_accuracy: 0.5891
Epoch 17/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.1019 - accuracy: 0.9673 - val_loss: 2.3008 - val_accuracy: 0.5946
Epoch 18/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0975 - accuracy: 0.9721 - val_loss: 2.2994 - val_accuracy: 0.6145
Epoch 19/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0855 - accuracy: 0.9730 - val_loss: 2.2945 - val_accuracy: 0.6052
Epoch 20/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0866 - accuracy: 0.9746 - val_loss: 2.2665 - val_accuracy: 0.5961
Epoch 21/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0836 - accuracy: 0.9748 - val_loss: 2.3269 - val_accuracy: 0.6046
Epoch 22/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0827 - accuracy: 0.9748 - val_loss: 2.2981 - val_accuracy: 0.6027
Epoch 23/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0724 - accuracy: 0.9792 - val_loss: 2.2865 - val_accuracy: 0.6016
Epoch 24/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0738 - accuracy: 0.9778 - val_loss: 2.3412 - val_accuracy: 0.5947
Epoch 25/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0681 - accuracy: 0.9786 - val_loss: 2.3214 - val_accuracy: 0.6099
Epoch 26/200
1063/1063 [==============================] - 32s 30ms/step - loss: 0.0688 - accuracy: 0.9804 - val_loss: 2.4391 - val_accuracy: 0.5966
Test set evaluation metrics
---------------------------
Loss:     1.547
Accuracy: 58.613%
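
All runs above are launched with epochs=200 but stop much earlier; e.g. this VGG16 run ends at epoch 26, twenty epochs after its best val_loss at epoch 6 (1.5687, close to the reported test loss of 1.547). This is consistent with patience-based early stopping on val_loss with the best weights restored. The actual callback and its patience value are configured earlier in the notebook; the following is only a minimal pure-Python sketch of the stopping rule, with the patience inferred from the log.

```python
# Sketch of patience-based early stopping: stop `patience` epochs after the
# last improvement of the monitored validation loss.
def early_stop_epoch(val_losses, patience):
    """Return the 1-based epoch at which training stops."""
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_losses)

# toy trace: val_loss improves until epoch 3, then plateaus -> stops at epoch 6
print(early_stop_epoch([2.0, 1.5, 1.2, 1.3, 1.4, 1.25, 1.5, 1.6], patience=3))
```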
MobileNet
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_80["MOBILENET_ALL"] = fit_and_test_model(number_of_classes, MobileNetV2_MODEL_OPTIMIZED, "MobileNet")
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_12 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 1280)              0         
_________________________________________________________________
dense_6 (Dense)              (None, 80)                102480    
=================================================================
Total params: 2,360,464
Trainable params: 2,326,352
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
1063/1063 [==============================] - 194s 179ms/step - loss: 2.5463 - accuracy: 0.3868 - val_loss: 3.3312 - val_accuracy: 0.2663
Epoch 2/200
1063/1063 [==============================] - 186s 175ms/step - loss: 0.7990 - accuracy: 0.7656 - val_loss: 2.9843 - val_accuracy: 0.3210
Epoch 3/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.4757 - accuracy: 0.8572 - val_loss: 1.4196 - val_accuracy: 0.6188
Epoch 4/200
1063/1063 [==============================] - 189s 177ms/step - loss: 0.2876 - accuracy: 0.9135 - val_loss: 0.9997 - val_accuracy: 0.7317
Epoch 5/200
1063/1063 [==============================] - 189s 177ms/step - loss: 0.1762 - accuracy: 0.9501 - val_loss: 0.9327 - val_accuracy: 0.7552
Epoch 6/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.1341 - accuracy: 0.9625 - val_loss: 0.9810 - val_accuracy: 0.7430
Epoch 7/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.1088 - accuracy: 0.9694 - val_loss: 1.1382 - val_accuracy: 0.7399
Epoch 8/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0876 - accuracy: 0.9758 - val_loss: 1.1429 - val_accuracy: 0.7447
Epoch 9/200
1063/1063 [==============================] - 185s 174ms/step - loss: 0.0859 - accuracy: 0.9741 - val_loss: 1.1945 - val_accuracy: 0.7269
Epoch 10/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0738 - accuracy: 0.9782 - val_loss: 1.0186 - val_accuracy: 0.7475
Epoch 11/200
1063/1063 [==============================] - 186s 175ms/step - loss: 0.0655 - accuracy: 0.9802 - val_loss: 1.1241 - val_accuracy: 0.7404
Epoch 12/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.0605 - accuracy: 0.9838 - val_loss: 1.1335 - val_accuracy: 0.7598
Epoch 13/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0577 - accuracy: 0.9838 - val_loss: 1.3899 - val_accuracy: 0.7239
Epoch 14/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0569 - accuracy: 0.9818 - val_loss: 1.3336 - val_accuracy: 0.7291
Epoch 15/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.0522 - accuracy: 0.9839 - val_loss: 1.2374 - val_accuracy: 0.7452
Epoch 16/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.0454 - accuracy: 0.9866 - val_loss: 1.2745 - val_accuracy: 0.7402
Epoch 17/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.0512 - accuracy: 0.9846 - val_loss: 1.1575 - val_accuracy: 0.7658
Epoch 18/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0465 - accuracy: 0.9854 - val_loss: 1.3131 - val_accuracy: 0.7414
Epoch 19/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0462 - accuracy: 0.9862 - val_loss: 1.3185 - val_accuracy: 0.7382
Epoch 20/200
1063/1063 [==============================] - 186s 175ms/step - loss: 0.0460 - accuracy: 0.9853 - val_loss: 1.1497 - val_accuracy: 0.7576
Epoch 21/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.0438 - accuracy: 0.9852 - val_loss: 1.2445 - val_accuracy: 0.7405
Epoch 22/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0413 - accuracy: 0.9866 - val_loss: 1.2310 - val_accuracy: 0.7483
Epoch 23/200
1063/1063 [==============================] - 187s 176ms/step - loss: 0.0333 - accuracy: 0.9901 - val_loss: 1.3972 - val_accuracy: 0.7357
Epoch 24/200
1063/1063 [==============================] - 189s 178ms/step - loss: 0.0423 - accuracy: 0.9867 - val_loss: 1.3882 - val_accuracy: 0.7342
Epoch 25/200
1063/1063 [==============================] - 188s 177ms/step - loss: 0.0398 - accuracy: 0.9877 - val_loss: 1.4748 - val_accuracy: 0.7349
Test set evaluation metrics
---------------------------
Loss:     0.915
Accuracy: 75.600%
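
A note on the MobileNet numbers above (our inference, not stated in the notebook): the summary shows `mobilenetv2_1.00_224` producing a (7, 7, 1280) feature map. Since MobileNetV2 halves the spatial resolution five times (an overall downsampling factor of 32), this implies 224×224 inputs, i.e. the 32×32 CIFAR images are presumably upsampled by 7× before the backbone. That would also explain the much higher step time (~178 ms/step versus ~30 ms for VGG16 and ~5 ms for the from-scratch CNNs).

```python
# The (7, 7, 1280) feature map implies 224x224 inputs: MobileNetV2 applies
# five stride-2 stages, an overall downsampling factor of 2**5 = 32.
downsampling = 2 ** 5          # total spatial reduction of the backbone
input_size = 7 * downsampling  # 7 x 32 = 224
upscale = input_size // 32     # CIFAR-100 images are 32x32 -> 7x upsampling
print(input_size, upscale)
```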
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True, classes_num = number_of_classes)
accuracies_opt_80["DENSENET_ALL"] = fit_and_test_model(number_of_classes, DENSENET_MODEL_OPTIMIZED, "DenseNet")
Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_13 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1024)              0         
_________________________________________________________________
dense_7 (Dense)              (None, 80)                82000     
=================================================================
Total params: 7,119,504
Trainable params: 7,035,856
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
1063/1063 [==============================] - 46s 35ms/step - loss: 4.6690 - accuracy: 0.0708 - val_loss: 2.5858 - val_accuracy: 0.3620
Epoch 2/200
1063/1063 [==============================] - 36s 34ms/step - loss: 2.8120 - accuracy: 0.3114 - val_loss: 2.0021 - val_accuracy: 0.4754
Epoch 3/200
1063/1063 [==============================] - 36s 33ms/step - loss: 2.1335 - accuracy: 0.4423 - val_loss: 1.7548 - val_accuracy: 0.5273
Epoch 4/200
1063/1063 [==============================] - 36s 33ms/step - loss: 1.7667 - accuracy: 0.5262 - val_loss: 1.6459 - val_accuracy: 0.5575
Epoch 5/200
1063/1063 [==============================] - 36s 34ms/step - loss: 1.4777 - accuracy: 0.5897 - val_loss: 1.6686 - val_accuracy: 0.5494
Epoch 6/200
1063/1063 [==============================] - 35s 33ms/step - loss: 1.2698 - accuracy: 0.6403 - val_loss: 1.5521 - val_accuracy: 0.5873
Epoch 7/200
1063/1063 [==============================] - 36s 34ms/step - loss: 1.0328 - accuracy: 0.7037 - val_loss: 1.4722 - val_accuracy: 0.6082
Epoch 8/200
1063/1063 [==============================] - 36s 33ms/step - loss: 0.8974 - accuracy: 0.7378 - val_loss: 1.4763 - val_accuracy: 0.6169
Epoch 9/200
1063/1063 [==============================] - 36s 33ms/step - loss: 0.7668 - accuracy: 0.7739 - val_loss: 1.5683 - val_accuracy: 0.6139
Epoch 10/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.6358 - accuracy: 0.8123 - val_loss: 1.5923 - val_accuracy: 0.6065
Epoch 11/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.5306 - accuracy: 0.8410 - val_loss: 1.6156 - val_accuracy: 0.6137
Epoch 12/200
1063/1063 [==============================] - 36s 33ms/step - loss: 0.4504 - accuracy: 0.8635 - val_loss: 1.6715 - val_accuracy: 0.6193
Epoch 13/200
1063/1063 [==============================] - 35s 33ms/step - loss: 0.3985 - accuracy: 0.8798 - val_loss: 1.7454 - val_accuracy: 0.6135
Epoch 14/200
1063/1063 [==============================] - 36s 33ms/step - loss: 0.3451 - accuracy: 0.8948 - val_loss: 1.8252 - val_accuracy: 0.6092
Epoch 15/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.3060 - accuracy: 0.9060 - val_loss: 1.7724 - val_accuracy: 0.6232
Epoch 16/200
1063/1063 [==============================] - 36s 33ms/step - loss: 0.2756 - accuracy: 0.9160 - val_loss: 1.8917 - val_accuracy: 0.6095
Epoch 17/200
1063/1063 [==============================] - 36s 33ms/step - loss: 0.2525 - accuracy: 0.9214 - val_loss: 1.8529 - val_accuracy: 0.6248
Epoch 18/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.2480 - accuracy: 0.9256 - val_loss: 1.9169 - val_accuracy: 0.6155
Epoch 19/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.2161 - accuracy: 0.9334 - val_loss: 1.9925 - val_accuracy: 0.6070
Epoch 20/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.2046 - accuracy: 0.9367 - val_loss: 2.0150 - val_accuracy: 0.6130
Epoch 21/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1840 - accuracy: 0.9435 - val_loss: 2.0179 - val_accuracy: 0.6187
Epoch 22/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1955 - accuracy: 0.9393 - val_loss: 1.9959 - val_accuracy: 0.6164
Epoch 23/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1711 - accuracy: 0.9460 - val_loss: 2.0602 - val_accuracy: 0.6099
Epoch 24/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1597 - accuracy: 0.9514 - val_loss: 1.9711 - val_accuracy: 0.6283
Epoch 25/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1571 - accuracy: 0.9502 - val_loss: 2.0448 - val_accuracy: 0.6184
Epoch 26/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1540 - accuracy: 0.9529 - val_loss: 2.0521 - val_accuracy: 0.6149
Epoch 27/200
1063/1063 [==============================] - 36s 34ms/step - loss: 0.1486 - accuracy: 0.9516 - val_loss: 1.9977 - val_accuracy: 0.6184
Test set evaluation metrics
---------------------------
Loss:     1.443
Accuracy: 61.387%

Comparison of training times of the optimal classifiers

We now print, in descending order, the training times of the optimized models on the 80-class problem.

In [ ]:
print("\033[1mFit times of Optimal Classifiers for 80 Classes:\n")
print("------------------------ fit times ------------------------\n")
# sort the models by training time, longest first
sorted_fit_times = sorted(fit_times.items(), key=lambda kv: kv[1], reverse=True)
for k, v in sorted_fit_times:
    hours, mins, secs = str(datetime.timedelta(seconds=v)).split(":")
    # int() drops leading zeros safely (lstrip("0") would turn "00" into "")
    if int(hours) == 0:
       print("\033[1m", k, ":", "\033[0m{} mins {} secs".format(int(mins), np.round(float(secs))))
    else:
       print("\033[1m", k, ":", "\033[0m{} h {} mins {} secs".format(int(hours), int(mins), np.round(float(secs))))
print()
Fit times of Optimal Classifiers for 80 Classes:

------------------------ fit times ------------------------

 MobileNet : 1 h 18 mins 22.0 secs
 DenseNet : 16 mins 14.0 secs
 VGG16 : 14 mins 7.0 secs
 Cnn2 : 7 mins 25.0 secs
 Cnn1 : 7 mins 16.0 secs
 Simple Model : 6 mins 53.0 secs

We observe that the transfer learning models generally have higher training times than the from-scratch ones. This is expected, since these models have a considerably larger number of trainable parameters.
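The hours/minutes/seconds formatting used in the cell above can also be expressed directly with `divmod`; a minimal, equivalent sketch (the function name `format_secs` is ours, not part of the notebook):

```python
def format_secs(v):
    # round to whole seconds, then split into hours / minutes / seconds
    mins, secs = divmod(round(v), 60)
    hours, mins = divmod(mins, 60)
    if hours:
        return f"{hours} h {mins} mins {secs} secs"
    return f"{mins} mins {secs} secs"

print(format_secs(4702))  # → 1 h 18 mins 22 secs (MobileNet's time above)
print(format_secs(974))   # → 16 mins 14 secs (DenseNet's time above)
```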

Comparison bar plots

In [ ]:
# set width of bar
barWidth = 0.15
model_names = ['Simple Model', 'CNN1', 'CNN2', 'VGG16', 'MobileNet', 'DenseNet']

# set height of bars
bar1 = [accuracies_opt["SIMPLE_MODEL"],accuracies_opt["CNN1"],accuracies_opt["CNN2"],accuracies_opt["VGG_ALL"],accuracies_opt["MOBILENET_ALL"],accuracies_opt["DENSENET_ALL"]]
bar2 = [accuracies_opt_40["SIMPLE_MODEL"],accuracies_opt_40["CNN1"],accuracies_opt_40["CNN2"],accuracies_opt_40["VGG_ALL"],accuracies_opt_40["MOBILENET_ALL"],accuracies_opt_40["DENSENET_ALL"]]
bar3 = [accuracies_opt_60["SIMPLE_MODEL"],accuracies_opt_60["CNN1"],accuracies_opt_60["CNN2"],accuracies_opt_60["VGG_ALL"],accuracies_opt_60["MOBILENET_ALL"],accuracies_opt_60["DENSENET_ALL"]]
bar4 = [accuracies_opt_80["SIMPLE_MODEL"],accuracies_opt_80["CNN1"],accuracies_opt_80["CNN2"],accuracies_opt_80["VGG_ALL"],accuracies_opt_80["MOBILENET_ALL"],accuracies_opt_80["DENSENET_ALL"]]

# Set position of bar on X axis
r1 = np.arange(6)
r2 = [x + barWidth for x in r1]
r3 = [x + barWidth for x in r2]
r4 = [x + barWidth for x in r3]


plt.figure(figsize=(12,5))
plt.bar(r1, bar1, color='#003f5c', width=barWidth, edgecolor='white', label = '20')
plt.bar(r2, bar2, color='#ffa600', width=barWidth, edgecolor='white', label = '40')
plt.bar(r3, bar3, color='#bc5090', width=barWidth, edgecolor='white', label = '60')
plt.bar(r4, bar4, color='#25A640', width=barWidth, edgecolor='white', label = '80')
plt.xticks([r + 1.5*barWidth for r in range(6)], model_names)
plt.ylim(bottom=0.1)
plt.legend(loc='best')
plt.title("Experiments on Number of Classes")
plt.ylabel("Classification Accuracy")
plt.grid(axis="y", linestyle="--")
plt.show()

We observe that as the number of classes increases (and, with it, the amount of data), classification accuracy drops. This is expected, since we are facing a considerably harder classification problem. For the 20-class subproblem, even the from-scratch models we implement achieve an accuracy of around 70% (CNN1 and CNN2 actually exceed it), while with transfer learning we obtain a correct-classification rate close to 90% (MobileNet).

These numbers decline as we move to larger numbers of classes. Specifically, there is a noticeable drop in performance, on the order of 10%, when the classes double from 20 to 40; for further increases in the number of classes, the drop is smaller. This demonstrates the scalability of our models, which adapt successfully to the growing volume of data. It is worth noting that on the 80-class problem, using the pretrained MobileNet network with all of its layers trainable, we obtain a test accuracy above 75%, which is quite high considering the difficulty of this classification task.

Effect of batch size on performance

So far we have trained our models with a batch size of 32. We now train our optimized models with batch sizes of 64, 128, and 200, in order to see how this increase affects test accuracy. The number of classes is kept fixed at 20.
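One immediate effect of a larger batch size is fewer gradient steps per epoch. A quick sketch, assuming the 20-class split (20 classes × 500 training images = 10,000, minus the 15% validation hold-out leaves 8,500 training samples):

```python
import numpy as np

# assumed training-set size for the 20-class split after the 15% validation hold-out
n_train = 8500

for bs in (32, 64, 128, 200):
    steps = int(np.ceil(n_train / bs))
    print(f"batch size {bs:>3}: {steps} steps per epoch")
```

For batch size 64 this gives 133 steps per epoch, matching the `133/133` progress bars in the training logs below.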

Batch size = 64

In [ ]:
# Number of classes
num_of_classes = 20

# select the number of classes
cifar100_classes_url = select_classes_number(num_of_classes)

We create our team's unique dataset:

In [ ]:
team_classes = pd.read_csv(cifar100_classes_url, sep=',', header=None)
CIFAR100_LABELS_LIST = pd.read_csv('https://pastebin.com/raw/qgDaNggt', sep=',', header=None).astype(str).values.tolist()[0]

our_index = team_classes.iloc[team_seed,:].values.tolist()
print(our_index)
our_classes = select_from_list(CIFAR100_LABELS_LIST, our_index)
train_index = get_ds_index(y_train_all, our_index)
test_index = get_ds_index(y_test_all, our_index)

x_train_ds = np.asarray(select_from_list(x_train_all, train_index))
y_train_ds = np.asarray(select_from_list(y_train_all, train_index))
x_test_ds = np.asarray(select_from_list(x_test_all, test_index))
y_test_ds = np.asarray(select_from_list(y_test_all, test_index))
[1, 6, 9, 19, 25, 26, 27, 29, 32, 33, 39, 42, 53, 68, 79, 86, 87, 88, 91, 98]
In [ ]:
# print our classes
print(our_classes)
[' aquarium_fish', ' bee', ' bottle', ' cattle', ' couch', ' crab', ' crocodile', ' dinosaur', ' flatfish', ' forest', ' keyboard', ' leopard', ' orange', ' road', ' spider', ' telephone', ' television', ' tiger', ' trout', ' woman']
In [ ]:
CLASSES_NUM=len(our_classes)
print(CLASSES_NUM)
20
In [ ]:
# get (train) dataset dimensions
data_size, img_rows, img_cols, img_channels = x_train_ds.shape

# set validation set percentage (wrt the training set size)
validation_percentage = 0.15
val_size = round(validation_percentage * data_size)

# Reserve val_size samples for validation and normalize all values
x_val = x_train_ds[-val_size:]/255
y_val = y_train_ds[-val_size:]
x_train = x_train_ds[:-val_size]/255
y_train = y_train_ds[:-val_size]
x_test = x_test_ds/255
y_test = y_test_ds
In [ ]:
y_train = create_new_labels(our_index,y_train)
y_val = create_new_labels(our_index,y_val)
y_test = create_new_labels(our_index,y_test)
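`create_new_labels` is defined earlier in the notebook; assuming it simply remaps the original CIFAR-100 label ids (e.g. 1, 6, 9, …) to the contiguous range 0…19 expected by the 20-unit output layer, a hypothetical re-implementation looks like this:

```python
import numpy as np

def create_new_labels_sketch(our_index, y):
    # hypothetical re-implementation: map each original CIFAR-100 label id
    # to its position in the (sorted) list of selected class ids
    mapping = {orig: new for new, orig in enumerate(sorted(our_index))}
    return np.asarray([mapping[int(label)] for label in np.ravel(y)])

print(create_new_labels_sketch([1, 6, 9], [[9], [1], [6]]))  # → [2 0 1]
```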
In [ ]:
BATCH_SIZE = 64

def _input_fn(x,y, BATCH_SIZE):
  ds = tf.data.Dataset.from_tensor_slices((x,y))
  ds = ds.shuffle(buffer_size=data_size)
  ds = ds.repeat()
  ds = ds.batch(BATCH_SIZE)
  ds = ds.prefetch(buffer_size=AUTOTUNE)
  return ds

train_ds =_input_fn(x_train,y_train, BATCH_SIZE) #PrefetchDataset object
validation_ds =_input_fn(x_val,y_val, BATCH_SIZE) #PrefetchDataset object
test_ds =_input_fn(x_test,y_test, BATCH_SIZE) #PrefetchDataset object

train_ds_res = train_ds.map(resize_transform)
validation_ds_res = validation_ds.map(resize_transform)
test_ds_res = test_ds.map(resize_transform)

def train_model(model, train_dataset = train_ds, validation_dataset = validation_ds, epochs = 100, callbacks = None, steps_per_epoch = int(np.ceil(x_train.shape[0]/BATCH_SIZE)), validation_steps = int(np.ceil(x_val.shape[0]/BATCH_SIZE))):
  history = model.fit(train_dataset, epochs=epochs, steps_per_epoch=steps_per_epoch, validation_data=validation_dataset, validation_steps=validation_steps, callbacks=callbacks)
  return(history)

def model_report(model, history, evaluation_dataset = test_ds, evaluation_steps = int(np.ceil(x_test.shape[0]/BATCH_SIZE))):
      plt = summarize_diagnostics(history)
      plt.show()
      return model_evaluation(model, evaluation_dataset, evaluation_steps)

"From scratch" networks

In [ ]:
accuracies_opt_64 = {}
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization (BatchNo (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu (ReLU)                 (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 32)        0         
_________________________________________________________________
dropout (Dropout)            (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_1 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_1 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_2 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_2 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten (Flatten)            (None, 1024)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 64)                65600     
_________________________________________________________________
dense_1 (Dense)              (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
133/133 [==============================] - 8s 9ms/step - loss: 4.2563 - accuracy: 0.0692 - val_loss: 4.1803 - val_accuracy: 0.0508
Epoch 2/200
133/133 [==============================] - 1s 6ms/step - loss: 3.9056 - accuracy: 0.1321 - val_loss: 4.3211 - val_accuracy: 0.0540
Epoch 3/200
133/133 [==============================] - 1s 5ms/step - loss: 3.6881 - accuracy: 0.1952 - val_loss: 4.1011 - val_accuracy: 0.1185
Epoch 4/200
133/133 [==============================] - 1s 5ms/step - loss: 3.5366 - accuracy: 0.2289 - val_loss: 3.6950 - val_accuracy: 0.1953
Epoch 5/200
133/133 [==============================] - 1s 6ms/step - loss: 3.3792 - accuracy: 0.2699 - val_loss: 3.5421 - val_accuracy: 0.2292
Epoch 6/200
133/133 [==============================] - 1s 6ms/step - loss: 3.2313 - accuracy: 0.2988 - val_loss: 3.1982 - val_accuracy: 0.3099
Epoch 7/200
133/133 [==============================] - 1s 5ms/step - loss: 3.0833 - accuracy: 0.3275 - val_loss: 3.0453 - val_accuracy: 0.3535
Epoch 8/200
133/133 [==============================] - 1s 5ms/step - loss: 2.9744 - accuracy: 0.3518 - val_loss: 3.0858 - val_accuracy: 0.3333
Epoch 9/200
133/133 [==============================] - 1s 6ms/step - loss: 2.8447 - accuracy: 0.3812 - val_loss: 2.9912 - val_accuracy: 0.3509
Epoch 10/200
133/133 [==============================] - 1s 6ms/step - loss: 2.7767 - accuracy: 0.3888 - val_loss: 3.0046 - val_accuracy: 0.3483
Epoch 11/200
133/133 [==============================] - 1s 6ms/step - loss: 2.6888 - accuracy: 0.4173 - val_loss: 2.9211 - val_accuracy: 0.3691
Epoch 12/200
133/133 [==============================] - 1s 6ms/step - loss: 2.5773 - accuracy: 0.4393 - val_loss: 2.9237 - val_accuracy: 0.3613
Epoch 13/200
133/133 [==============================] - 1s 6ms/step - loss: 2.5336 - accuracy: 0.4376 - val_loss: 2.7839 - val_accuracy: 0.3939
Epoch 14/200
133/133 [==============================] - 1s 6ms/step - loss: 2.4575 - accuracy: 0.4507 - val_loss: 2.8788 - val_accuracy: 0.3678
Epoch 15/200
133/133 [==============================] - 1s 6ms/step - loss: 2.4135 - accuracy: 0.4612 - val_loss: 2.5489 - val_accuracy: 0.4408
Epoch 16/200
133/133 [==============================] - 1s 5ms/step - loss: 2.3572 - accuracy: 0.4825 - val_loss: 2.5396 - val_accuracy: 0.4427
Epoch 17/200
133/133 [==============================] - 1s 5ms/step - loss: 2.2860 - accuracy: 0.4901 - val_loss: 2.4556 - val_accuracy: 0.4564
Epoch 18/200
133/133 [==============================] - 1s 6ms/step - loss: 2.2604 - accuracy: 0.4949 - val_loss: 2.7936 - val_accuracy: 0.3783
Epoch 19/200
133/133 [==============================] - 1s 6ms/step - loss: 2.1683 - accuracy: 0.5011 - val_loss: 2.7464 - val_accuracy: 0.3796
Epoch 20/200
133/133 [==============================] - 1s 5ms/step - loss: 2.1592 - accuracy: 0.5125 - val_loss: 2.5532 - val_accuracy: 0.4128
Epoch 21/200
133/133 [==============================] - 1s 5ms/step - loss: 2.0989 - accuracy: 0.5114 - val_loss: 2.4447 - val_accuracy: 0.4368
Epoch 22/200
133/133 [==============================] - 1s 5ms/step - loss: 2.0393 - accuracy: 0.5378 - val_loss: 2.2946 - val_accuracy: 0.4766
Epoch 23/200
133/133 [==============================] - 1s 5ms/step - loss: 2.0248 - accuracy: 0.5347 - val_loss: 2.1341 - val_accuracy: 0.5013
Epoch 24/200
133/133 [==============================] - 1s 6ms/step - loss: 1.9791 - accuracy: 0.5412 - val_loss: 2.2528 - val_accuracy: 0.4811
Epoch 25/200
133/133 [==============================] - 1s 6ms/step - loss: 1.9317 - accuracy: 0.5486 - val_loss: 2.2231 - val_accuracy: 0.4818
Epoch 26/200
133/133 [==============================] - 1s 6ms/step - loss: 1.9180 - accuracy: 0.5518 - val_loss: 2.1092 - val_accuracy: 0.5085
Epoch 27/200
133/133 [==============================] - 1s 6ms/step - loss: 1.8688 - accuracy: 0.5617 - val_loss: 2.2862 - val_accuracy: 0.4583
Epoch 28/200
133/133 [==============================] - 1s 6ms/step - loss: 1.8451 - accuracy: 0.5729 - val_loss: 2.0987 - val_accuracy: 0.4980
Epoch 29/200
133/133 [==============================] - 1s 6ms/step - loss: 1.8094 - accuracy: 0.5734 - val_loss: 2.1221 - val_accuracy: 0.5000
Epoch 30/200
133/133 [==============================] - 1s 6ms/step - loss: 1.7782 - accuracy: 0.5793 - val_loss: 2.1093 - val_accuracy: 0.4974
Epoch 31/200
133/133 [==============================] - 1s 6ms/step - loss: 1.7603 - accuracy: 0.5760 - val_loss: 2.0290 - val_accuracy: 0.5176
Epoch 32/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6865 - accuracy: 0.5999 - val_loss: 1.9536 - val_accuracy: 0.5254
Epoch 33/200
133/133 [==============================] - 1s 6ms/step - loss: 1.7050 - accuracy: 0.5979 - val_loss: 2.1850 - val_accuracy: 0.4772
Epoch 34/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6588 - accuracy: 0.5937 - val_loss: 1.8200 - val_accuracy: 0.5677
Epoch 35/200
133/133 [==============================] - 1s 5ms/step - loss: 1.6307 - accuracy: 0.6150 - val_loss: 2.0968 - val_accuracy: 0.4974
Epoch 36/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6280 - accuracy: 0.6055 - val_loss: 1.8985 - val_accuracy: 0.5508
Epoch 37/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6030 - accuracy: 0.6147 - val_loss: 1.8614 - val_accuracy: 0.5508
Epoch 38/200
133/133 [==============================] - 1s 6ms/step - loss: 1.5747 - accuracy: 0.6146 - val_loss: 1.8990 - val_accuracy: 0.5384
Epoch 39/200
133/133 [==============================] - 1s 6ms/step - loss: 1.5307 - accuracy: 0.6338 - val_loss: 2.0300 - val_accuracy: 0.4987
Epoch 40/200
133/133 [==============================] - 1s 6ms/step - loss: 1.5078 - accuracy: 0.6394 - val_loss: 1.8766 - val_accuracy: 0.5488
Epoch 41/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4950 - accuracy: 0.6393 - val_loss: 1.7324 - val_accuracy: 0.5690
Epoch 42/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4852 - accuracy: 0.6375 - val_loss: 2.0600 - val_accuracy: 0.5065
Epoch 43/200
133/133 [==============================] - 1s 5ms/step - loss: 1.4481 - accuracy: 0.6472 - val_loss: 1.8118 - val_accuracy: 0.5592
Epoch 44/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4521 - accuracy: 0.6483 - val_loss: 1.8413 - val_accuracy: 0.5501
Epoch 45/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4126 - accuracy: 0.6555 - val_loss: 2.0901 - val_accuracy: 0.5026
Epoch 46/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4131 - accuracy: 0.6483 - val_loss: 1.7467 - val_accuracy: 0.5612
Epoch 47/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4006 - accuracy: 0.6564 - val_loss: 1.6912 - val_accuracy: 0.5911
Epoch 48/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3566 - accuracy: 0.6599 - val_loss: 1.6253 - val_accuracy: 0.5938
Epoch 49/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3460 - accuracy: 0.6667 - val_loss: 1.6611 - val_accuracy: 0.5911
Epoch 50/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3297 - accuracy: 0.6655 - val_loss: 1.6240 - val_accuracy: 0.5918
Epoch 51/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2896 - accuracy: 0.6827 - val_loss: 1.7440 - val_accuracy: 0.5729
Epoch 52/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2840 - accuracy: 0.6804 - val_loss: 1.5911 - val_accuracy: 0.6022
Epoch 53/200
133/133 [==============================] - 1s 5ms/step - loss: 1.2932 - accuracy: 0.6706 - val_loss: 1.5989 - val_accuracy: 0.5944
Epoch 54/200
133/133 [==============================] - 1s 5ms/step - loss: 1.2609 - accuracy: 0.6909 - val_loss: 1.6687 - val_accuracy: 0.5742
Epoch 55/200
133/133 [==============================] - 1s 5ms/step - loss: 1.2655 - accuracy: 0.6822 - val_loss: 1.6197 - val_accuracy: 0.5977
Epoch 56/200
133/133 [==============================] - 1s 5ms/step - loss: 1.2505 - accuracy: 0.6933 - val_loss: 1.6418 - val_accuracy: 0.5885
Epoch 57/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2300 - accuracy: 0.6842 - val_loss: 1.5343 - val_accuracy: 0.6146
Epoch 58/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2321 - accuracy: 0.6965 - val_loss: 1.6582 - val_accuracy: 0.5814
Epoch 59/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1961 - accuracy: 0.7025 - val_loss: 1.5476 - val_accuracy: 0.6172
Epoch 60/200
133/133 [==============================] - 1s 5ms/step - loss: 1.1985 - accuracy: 0.6947 - val_loss: 1.5944 - val_accuracy: 0.5977
Epoch 61/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1462 - accuracy: 0.7181 - val_loss: 1.5831 - val_accuracy: 0.6094
Epoch 62/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1744 - accuracy: 0.7045 - val_loss: 1.5303 - val_accuracy: 0.6087
Epoch 63/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1459 - accuracy: 0.7069 - val_loss: 1.4843 - val_accuracy: 0.6217
Epoch 64/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1439 - accuracy: 0.7152 - val_loss: 1.5234 - val_accuracy: 0.6237
Epoch 65/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1242 - accuracy: 0.7167 - val_loss: 1.5615 - val_accuracy: 0.6087
Epoch 66/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1358 - accuracy: 0.7076 - val_loss: 1.4899 - val_accuracy: 0.6328
Epoch 67/200
133/133 [==============================] - 1s 5ms/step - loss: 1.1097 - accuracy: 0.7163 - val_loss: 1.4593 - val_accuracy: 0.6309
Epoch 68/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0816 - accuracy: 0.7219 - val_loss: 1.4980 - val_accuracy: 0.6172
Epoch 69/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0908 - accuracy: 0.7235 - val_loss: 1.4326 - val_accuracy: 0.6400
Epoch 70/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0922 - accuracy: 0.7171 - val_loss: 1.5773 - val_accuracy: 0.6120
Epoch 71/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0586 - accuracy: 0.7316 - val_loss: 1.4514 - val_accuracy: 0.6341
Epoch 72/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0764 - accuracy: 0.7234 - val_loss: 1.6940 - val_accuracy: 0.5775
Epoch 73/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0559 - accuracy: 0.7344 - val_loss: 1.3704 - val_accuracy: 0.6536
Epoch 74/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0342 - accuracy: 0.7326 - val_loss: 1.5402 - val_accuracy: 0.6100
Epoch 75/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0350 - accuracy: 0.7326 - val_loss: 1.4235 - val_accuracy: 0.6348
Epoch 76/200
133/133 [==============================] - 1s 5ms/step - loss: 0.9906 - accuracy: 0.7460 - val_loss: 1.5391 - val_accuracy: 0.6100
Epoch 77/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9977 - accuracy: 0.7421 - val_loss: 1.7469 - val_accuracy: 0.5671
Epoch 78/200
133/133 [==============================] - 1s 5ms/step - loss: 1.0146 - accuracy: 0.7363 - val_loss: 1.5735 - val_accuracy: 0.6100
Epoch 79/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9785 - accuracy: 0.7483 - val_loss: 1.3937 - val_accuracy: 0.6445
Epoch 80/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9848 - accuracy: 0.7482 - val_loss: 1.6820 - val_accuracy: 0.5892
Epoch 81/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9723 - accuracy: 0.7414 - val_loss: 1.4827 - val_accuracy: 0.6309
Epoch 82/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9919 - accuracy: 0.7382 - val_loss: 1.3423 - val_accuracy: 0.6615
Epoch 83/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9663 - accuracy: 0.7447 - val_loss: 1.4844 - val_accuracy: 0.6257
Epoch 84/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9492 - accuracy: 0.7547 - val_loss: 1.4318 - val_accuracy: 0.6380
Epoch 85/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9321 - accuracy: 0.7614 - val_loss: 1.4057 - val_accuracy: 0.6374
Epoch 86/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9334 - accuracy: 0.7603 - val_loss: 1.4004 - val_accuracy: 0.6458
Epoch 87/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9373 - accuracy: 0.7522 - val_loss: 1.3790 - val_accuracy: 0.6504
Epoch 88/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9047 - accuracy: 0.7615 - val_loss: 1.4224 - val_accuracy: 0.6419
Epoch 89/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8964 - accuracy: 0.7639 - val_loss: 1.3562 - val_accuracy: 0.6569
Epoch 90/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9007 - accuracy: 0.7654 - val_loss: 1.3686 - val_accuracy: 0.6556
Epoch 91/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8979 - accuracy: 0.7712 - val_loss: 1.4563 - val_accuracy: 0.6341
Epoch 92/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8707 - accuracy: 0.7749 - val_loss: 1.4031 - val_accuracy: 0.6484
Epoch 93/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8664 - accuracy: 0.7757 - val_loss: 1.5358 - val_accuracy: 0.6120
Epoch 94/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8842 - accuracy: 0.7668 - val_loss: 1.4286 - val_accuracy: 0.6322
Epoch 95/200
133/133 [==============================] - 1s 5ms/step - loss: 0.8513 - accuracy: 0.7795 - val_loss: 1.4563 - val_accuracy: 0.6341
Epoch 96/200
133/133 [==============================] - 1s 5ms/step - loss: 0.8738 - accuracy: 0.7671 - val_loss: 1.3771 - val_accuracy: 0.6426
Epoch 97/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8678 - accuracy: 0.7714 - val_loss: 1.3489 - val_accuracy: 0.6576
Epoch 98/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8522 - accuracy: 0.7686 - val_loss: 1.3341 - val_accuracy: 0.6634
Epoch 99/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8324 - accuracy: 0.7837 - val_loss: 1.3956 - val_accuracy: 0.6458
Epoch 100/200
133/133 [==============================] - 1s 5ms/step - loss: 0.8316 - accuracy: 0.7807 - val_loss: 1.4564 - val_accuracy: 0.6289
Epoch 101/200
133/133 [==============================] - 1s 5ms/step - loss: 0.8370 - accuracy: 0.7810 - val_loss: 1.2952 - val_accuracy: 0.6615
Epoch 102/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8222 - accuracy: 0.7841 - val_loss: 1.3571 - val_accuracy: 0.6660
Epoch 103/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7960 - accuracy: 0.7909 - val_loss: 1.3677 - val_accuracy: 0.6504
Epoch 104/200
133/133 [==============================] - 1s 5ms/step - loss: 0.8121 - accuracy: 0.7803 - val_loss: 1.2839 - val_accuracy: 0.6725
Epoch 105/200
133/133 [==============================] - 1s 5ms/step - loss: 0.7815 - accuracy: 0.7946 - val_loss: 1.3911 - val_accuracy: 0.6484
Epoch 106/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7983 - accuracy: 0.7809 - val_loss: 1.2639 - val_accuracy: 0.6719
Epoch 107/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7890 - accuracy: 0.7911 - val_loss: 1.3427 - val_accuracy: 0.6582
Epoch 108/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7709 - accuracy: 0.7963 - val_loss: 1.3405 - val_accuracy: 0.6576
Epoch 109/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7890 - accuracy: 0.7910 - val_loss: 1.4152 - val_accuracy: 0.6426
Epoch 110/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7935 - accuracy: 0.7884 - val_loss: 1.2594 - val_accuracy: 0.6829
Epoch 111/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7667 - accuracy: 0.7995 - val_loss: 1.2816 - val_accuracy: 0.6758
Epoch 112/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7888 - accuracy: 0.7896 - val_loss: 1.3706 - val_accuracy: 0.6497
Epoch 113/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7705 - accuracy: 0.8005 - val_loss: 1.2622 - val_accuracy: 0.6758
Epoch 114/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7596 - accuracy: 0.8064 - val_loss: 1.2931 - val_accuracy: 0.6732
Epoch 115/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7530 - accuracy: 0.8028 - val_loss: 1.3326 - val_accuracy: 0.6549
Epoch 116/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7637 - accuracy: 0.7944 - val_loss: 1.2715 - val_accuracy: 0.6745
Epoch 117/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7551 - accuracy: 0.7970 - val_loss: 1.3924 - val_accuracy: 0.6413
Epoch 118/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7387 - accuracy: 0.8049 - val_loss: 1.2791 - val_accuracy: 0.6673
Epoch 119/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7358 - accuracy: 0.8019 - val_loss: 1.3555 - val_accuracy: 0.6582
Epoch 120/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7127 - accuracy: 0.8088 - val_loss: 1.2858 - val_accuracy: 0.6706
Epoch 121/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7004 - accuracy: 0.8120 - val_loss: 1.3437 - val_accuracy: 0.6686
Epoch 122/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7366 - accuracy: 0.8066 - val_loss: 1.2876 - val_accuracy: 0.6803
Epoch 123/200
133/133 [==============================] - 1s 5ms/step - loss: 0.6937 - accuracy: 0.8177 - val_loss: 1.3332 - val_accuracy: 0.6634
Epoch 124/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7198 - accuracy: 0.8096 - val_loss: 1.3056 - val_accuracy: 0.6699
Epoch 125/200
133/133 [==============================] - 1s 5ms/step - loss: 0.7094 - accuracy: 0.8124 - val_loss: 1.2961 - val_accuracy: 0.6706
Epoch 126/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7002 - accuracy: 0.8123 - val_loss: 1.3124 - val_accuracy: 0.6699
Epoch 127/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6986 - accuracy: 0.8150 - val_loss: 1.3113 - val_accuracy: 0.6641
Epoch 128/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7010 - accuracy: 0.8162 - val_loss: 1.3559 - val_accuracy: 0.6510
Epoch 129/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7014 - accuracy: 0.8143 - val_loss: 1.2336 - val_accuracy: 0.6849
Epoch 130/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6929 - accuracy: 0.8093 - val_loss: 1.3042 - val_accuracy: 0.6699
Epoch 131/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6814 - accuracy: 0.8221 - val_loss: 1.2402 - val_accuracy: 0.6751
Epoch 132/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6750 - accuracy: 0.8179 - val_loss: 1.3547 - val_accuracy: 0.6693
Epoch 133/200
133/133 [==============================] - 1s 5ms/step - loss: 0.6549 - accuracy: 0.8279 - val_loss: 1.2869 - val_accuracy: 0.6628
Epoch 134/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6623 - accuracy: 0.8226 - val_loss: 1.2839 - val_accuracy: 0.6745
Epoch 135/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6631 - accuracy: 0.8274 - val_loss: 1.3453 - val_accuracy: 0.6595
Epoch 136/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6744 - accuracy: 0.8179 - val_loss: 1.4505 - val_accuracy: 0.6387
Epoch 137/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6736 - accuracy: 0.8166 - val_loss: 1.2504 - val_accuracy: 0.6888
Epoch 138/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6331 - accuracy: 0.8332 - val_loss: 1.3373 - val_accuracy: 0.6738
Epoch 139/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6558 - accuracy: 0.8249 - val_loss: 1.2363 - val_accuracy: 0.6842
Epoch 140/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6585 - accuracy: 0.8312 - val_loss: 1.3435 - val_accuracy: 0.6569
Epoch 141/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6394 - accuracy: 0.8358 - val_loss: 1.3241 - val_accuracy: 0.6621
Epoch 142/200
133/133 [==============================] - 1s 5ms/step - loss: 0.6324 - accuracy: 0.8339 - val_loss: 1.3843 - val_accuracy: 0.6628
Epoch 143/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6317 - accuracy: 0.8308 - val_loss: 1.3066 - val_accuracy: 0.6751
Epoch 144/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6338 - accuracy: 0.8291 - val_loss: 1.2733 - val_accuracy: 0.6732
Epoch 145/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6306 - accuracy: 0.8306 - val_loss: 1.2256 - val_accuracy: 0.6862
Epoch 146/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6281 - accuracy: 0.8343 - val_loss: 1.3236 - val_accuracy: 0.6602
Epoch 147/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6274 - accuracy: 0.8328 - val_loss: 1.3467 - val_accuracy: 0.6497
Epoch 148/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6138 - accuracy: 0.8352 - val_loss: 1.3472 - val_accuracy: 0.6602
Epoch 149/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5921 - accuracy: 0.8474 - val_loss: 1.3324 - val_accuracy: 0.6693
Epoch 150/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5992 - accuracy: 0.8425 - val_loss: 1.3722 - val_accuracy: 0.6608
Epoch 151/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6073 - accuracy: 0.8401 - val_loss: 1.4474 - val_accuracy: 0.6497
Epoch 152/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5974 - accuracy: 0.8434 - val_loss: 1.2729 - val_accuracy: 0.6725
Epoch 153/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6279 - accuracy: 0.8334 - val_loss: 1.3094 - val_accuracy: 0.6641
Epoch 154/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6030 - accuracy: 0.8499 - val_loss: 1.3338 - val_accuracy: 0.6634
Epoch 155/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6102 - accuracy: 0.8329 - val_loss: 1.2603 - val_accuracy: 0.6777
Epoch 156/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6443 - accuracy: 0.8266 - val_loss: 1.3074 - val_accuracy: 0.6595
Epoch 157/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6356 - accuracy: 0.8245 - val_loss: 1.2439 - val_accuracy: 0.6810
Epoch 158/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5948 - accuracy: 0.8475 - val_loss: 1.4232 - val_accuracy: 0.6517
Epoch 159/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5992 - accuracy: 0.8448 - val_loss: 1.2881 - val_accuracy: 0.6621
Epoch 160/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5898 - accuracy: 0.8408 - val_loss: 1.3859 - val_accuracy: 0.6576
Epoch 161/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5677 - accuracy: 0.8552 - val_loss: 1.4007 - val_accuracy: 0.6523
Epoch 162/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5685 - accuracy: 0.8502 - val_loss: 1.3599 - val_accuracy: 0.6576
Epoch 163/200
133/133 [==============================] - 1s 6ms/step - loss: 0.5939 - accuracy: 0.8460 - val_loss: 1.2405 - val_accuracy: 0.6855
Epoch 164/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6070 - accuracy: 0.8411 - val_loss: 1.4595 - val_accuracy: 0.6484
Epoch 165/200
133/133 [==============================] - 1s 5ms/step - loss: 0.5836 - accuracy: 0.8460 - val_loss: 1.2698 - val_accuracy: 0.6803
In [ ]:
_, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
accuracies_opt_64["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.257
Accuracy: 67.725%
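Training of the optimized simple model halted at epoch 165 of 200, which suggests the early-stopping callback passed as `callbacks=[callback]` fired. Its patience logic can be sketched in plain Python (a hypothetical illustration; the actual callback is a `tf.keras` callback defined earlier in the notebook):

```python
# Hypothetical sketch of early-stopping patience logic; the real callback
# used in train_model() is defined earlier in the notebook.
def early_stop_epoch(val_losses, patience):
    """Return the 1-based epoch at which training halts: the first epoch
    after which validation loss has not improved for `patience` epochs."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)  # no stop: trained for all epochs
```

With a sufficiently large patience, noisy epoch-to-epoch fluctuations in `val_loss` (visible throughout the logs above) do not trigger a premature stop.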
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_4 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dropout_6 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
Epoch 1/200
133/133 [==============================] - 2s 7ms/step - loss: 4.2530 - accuracy: 0.1115 - val_loss: 4.3673 - val_accuracy: 0.0534
Epoch 2/200
133/133 [==============================] - 1s 6ms/step - loss: 3.7286 - accuracy: 0.2410 - val_loss: 4.4686 - val_accuracy: 0.0827
Epoch 3/200
133/133 [==============================] - 1s 6ms/step - loss: 3.4549 - accuracy: 0.2975 - val_loss: 4.1141 - val_accuracy: 0.1315
Epoch 4/200
133/133 [==============================] - 1s 6ms/step - loss: 3.2565 - accuracy: 0.3392 - val_loss: 3.6171 - val_accuracy: 0.2051
Epoch 5/200
133/133 [==============================] - 1s 6ms/step - loss: 3.0617 - accuracy: 0.3701 - val_loss: 3.1100 - val_accuracy: 0.3535
Epoch 6/200
133/133 [==============================] - 1s 6ms/step - loss: 2.8891 - accuracy: 0.4037 - val_loss: 3.0242 - val_accuracy: 0.3451
Epoch 7/200
133/133 [==============================] - 1s 6ms/step - loss: 2.7791 - accuracy: 0.4170 - val_loss: 2.7819 - val_accuracy: 0.4154
Epoch 8/200
133/133 [==============================] - 1s 6ms/step - loss: 2.6538 - accuracy: 0.4382 - val_loss: 2.8114 - val_accuracy: 0.4017
Epoch 9/200
133/133 [==============================] - 1s 6ms/step - loss: 2.5411 - accuracy: 0.4612 - val_loss: 3.0905 - val_accuracy: 0.3229
Epoch 10/200
133/133 [==============================] - 1s 6ms/step - loss: 2.4636 - accuracy: 0.4675 - val_loss: 2.5844 - val_accuracy: 0.4499
Epoch 11/200
133/133 [==============================] - 1s 6ms/step - loss: 2.3739 - accuracy: 0.4830 - val_loss: 2.7285 - val_accuracy: 0.4186
Epoch 12/200
133/133 [==============================] - 1s 6ms/step - loss: 2.2637 - accuracy: 0.5061 - val_loss: 2.5235 - val_accuracy: 0.4531
Epoch 13/200
133/133 [==============================] - 1s 6ms/step - loss: 2.2329 - accuracy: 0.5105 - val_loss: 2.6218 - val_accuracy: 0.4258
Epoch 14/200
133/133 [==============================] - 1s 6ms/step - loss: 2.1545 - accuracy: 0.5157 - val_loss: 2.4684 - val_accuracy: 0.4616
Epoch 15/200
133/133 [==============================] - 1s 6ms/step - loss: 2.0826 - accuracy: 0.5338 - val_loss: 2.5526 - val_accuracy: 0.4277
Epoch 16/200
133/133 [==============================] - 1s 6ms/step - loss: 2.0292 - accuracy: 0.5388 - val_loss: 2.4377 - val_accuracy: 0.4577
Epoch 17/200
133/133 [==============================] - 1s 6ms/step - loss: 1.9524 - accuracy: 0.5683 - val_loss: 2.4029 - val_accuracy: 0.4492
Epoch 18/200
133/133 [==============================] - 1s 6ms/step - loss: 1.9306 - accuracy: 0.5676 - val_loss: 2.5437 - val_accuracy: 0.4401
Epoch 19/200
133/133 [==============================] - 1s 6ms/step - loss: 1.8518 - accuracy: 0.5726 - val_loss: 2.3568 - val_accuracy: 0.4681
Epoch 20/200
133/133 [==============================] - 1s 6ms/step - loss: 1.8160 - accuracy: 0.5814 - val_loss: 2.2945 - val_accuracy: 0.4674
Epoch 21/200
133/133 [==============================] - 1s 6ms/step - loss: 1.7904 - accuracy: 0.5922 - val_loss: 2.2835 - val_accuracy: 0.4701
Epoch 22/200
133/133 [==============================] - 1s 6ms/step - loss: 1.7413 - accuracy: 0.5986 - val_loss: 2.1282 - val_accuracy: 0.5059
Epoch 23/200
133/133 [==============================] - 1s 6ms/step - loss: 1.7250 - accuracy: 0.6006 - val_loss: 2.1296 - val_accuracy: 0.5026
Epoch 24/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6863 - accuracy: 0.6068 - val_loss: 2.1190 - val_accuracy: 0.5163
Epoch 25/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6468 - accuracy: 0.6195 - val_loss: 2.0485 - val_accuracy: 0.5215
Epoch 26/200
133/133 [==============================] - 1s 6ms/step - loss: 1.6225 - accuracy: 0.6157 - val_loss: 2.1690 - val_accuracy: 0.4889
Epoch 27/200
133/133 [==============================] - 1s 6ms/step - loss: 1.5725 - accuracy: 0.6320 - val_loss: 1.8057 - val_accuracy: 0.5755
Epoch 28/200
133/133 [==============================] - 1s 6ms/step - loss: 1.5758 - accuracy: 0.6290 - val_loss: 2.2410 - val_accuracy: 0.4766
Epoch 29/200
133/133 [==============================] - 1s 6ms/step - loss: 1.5280 - accuracy: 0.6349 - val_loss: 1.8776 - val_accuracy: 0.5488
Epoch 30/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4797 - accuracy: 0.6507 - val_loss: 1.9218 - val_accuracy: 0.5501
Epoch 31/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4787 - accuracy: 0.6448 - val_loss: 2.1400 - val_accuracy: 0.5085
Epoch 32/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4358 - accuracy: 0.6548 - val_loss: 1.7207 - val_accuracy: 0.5801
Epoch 33/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3994 - accuracy: 0.6626 - val_loss: 1.7434 - val_accuracy: 0.5814
Epoch 34/200
133/133 [==============================] - 1s 6ms/step - loss: 1.4075 - accuracy: 0.6596 - val_loss: 1.7541 - val_accuracy: 0.5684
Epoch 35/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3853 - accuracy: 0.6608 - val_loss: 1.7472 - val_accuracy: 0.5723
Epoch 36/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3148 - accuracy: 0.6855 - val_loss: 1.6112 - val_accuracy: 0.6139
Epoch 37/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3224 - accuracy: 0.6782 - val_loss: 1.5494 - val_accuracy: 0.6113
Epoch 38/200
133/133 [==============================] - 1s 6ms/step - loss: 1.3201 - accuracy: 0.6800 - val_loss: 1.6270 - val_accuracy: 0.5990
Epoch 39/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2828 - accuracy: 0.6879 - val_loss: 1.6597 - val_accuracy: 0.5924
Epoch 40/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2724 - accuracy: 0.6935 - val_loss: 1.9796 - val_accuracy: 0.5247
Epoch 41/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2489 - accuracy: 0.6967 - val_loss: 2.0247 - val_accuracy: 0.5306
Epoch 42/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2488 - accuracy: 0.6914 - val_loss: 1.8544 - val_accuracy: 0.5404
Epoch 43/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2407 - accuracy: 0.6916 - val_loss: 1.6547 - val_accuracy: 0.5827
Epoch 44/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1823 - accuracy: 0.7043 - val_loss: 1.6884 - val_accuracy: 0.5872
Epoch 45/200
133/133 [==============================] - 1s 6ms/step - loss: 1.2161 - accuracy: 0.7001 - val_loss: 1.7319 - val_accuracy: 0.5716
Epoch 46/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1597 - accuracy: 0.7130 - val_loss: 1.5896 - val_accuracy: 0.5957
Epoch 47/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1438 - accuracy: 0.7114 - val_loss: 1.6829 - val_accuracy: 0.5990
Epoch 48/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1455 - accuracy: 0.7169 - val_loss: 1.5802 - val_accuracy: 0.6152
Epoch 49/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1325 - accuracy: 0.7217 - val_loss: 1.4591 - val_accuracy: 0.6374
Epoch 50/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1040 - accuracy: 0.7293 - val_loss: 1.3835 - val_accuracy: 0.6628
Epoch 51/200
133/133 [==============================] - 1s 6ms/step - loss: 1.1094 - accuracy: 0.7240 - val_loss: 1.6999 - val_accuracy: 0.5736
Epoch 52/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0872 - accuracy: 0.7226 - val_loss: 1.5695 - val_accuracy: 0.6139
Epoch 53/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0762 - accuracy: 0.7267 - val_loss: 1.5590 - val_accuracy: 0.6185
Epoch 54/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0649 - accuracy: 0.7314 - val_loss: 1.5393 - val_accuracy: 0.6133
Epoch 55/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0204 - accuracy: 0.7476 - val_loss: 1.4457 - val_accuracy: 0.6380
Epoch 56/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0347 - accuracy: 0.7480 - val_loss: 1.4870 - val_accuracy: 0.6257
Epoch 57/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9946 - accuracy: 0.7509 - val_loss: 1.6789 - val_accuracy: 0.5801
Epoch 58/200
133/133 [==============================] - 1s 6ms/step - loss: 1.0072 - accuracy: 0.7401 - val_loss: 1.3667 - val_accuracy: 0.6530
Epoch 59/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9901 - accuracy: 0.7514 - val_loss: 1.4512 - val_accuracy: 0.6315
Epoch 60/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9874 - accuracy: 0.7492 - val_loss: 1.5001 - val_accuracy: 0.6367
Epoch 61/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9721 - accuracy: 0.7520 - val_loss: 1.4998 - val_accuracy: 0.6361
Epoch 62/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9593 - accuracy: 0.7570 - val_loss: 1.3549 - val_accuracy: 0.6634
Epoch 63/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9368 - accuracy: 0.7637 - val_loss: 1.2913 - val_accuracy: 0.6706
Epoch 64/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9322 - accuracy: 0.7646 - val_loss: 1.5140 - val_accuracy: 0.6335
Epoch 65/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9119 - accuracy: 0.7695 - val_loss: 1.2705 - val_accuracy: 0.6777
Epoch 66/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9234 - accuracy: 0.7613 - val_loss: 1.3908 - val_accuracy: 0.6523
Epoch 67/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8953 - accuracy: 0.7735 - val_loss: 1.3720 - val_accuracy: 0.6628
Epoch 68/200
133/133 [==============================] - 1s 6ms/step - loss: 0.9091 - accuracy: 0.7680 - val_loss: 1.3321 - val_accuracy: 0.6562
Epoch 69/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8993 - accuracy: 0.7691 - val_loss: 1.3884 - val_accuracy: 0.6439
Epoch 70/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8749 - accuracy: 0.7767 - val_loss: 1.3185 - val_accuracy: 0.6680
Epoch 71/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8461 - accuracy: 0.7865 - val_loss: 1.3386 - val_accuracy: 0.6673
Epoch 72/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8545 - accuracy: 0.7812 - val_loss: 1.3809 - val_accuracy: 0.6530
Epoch 73/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8456 - accuracy: 0.7830 - val_loss: 1.4510 - val_accuracy: 0.6341
Epoch 74/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8505 - accuracy: 0.7854 - val_loss: 1.3313 - val_accuracy: 0.6803
Epoch 75/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8285 - accuracy: 0.7903 - val_loss: 1.2461 - val_accuracy: 0.6927
Epoch 76/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8175 - accuracy: 0.7993 - val_loss: 1.2958 - val_accuracy: 0.6667
Epoch 77/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8069 - accuracy: 0.7964 - val_loss: 1.3874 - val_accuracy: 0.6549
Epoch 78/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8252 - accuracy: 0.7867 - val_loss: 1.3470 - val_accuracy: 0.6602
Epoch 79/200
133/133 [==============================] - 1s 6ms/step - loss: 0.8167 - accuracy: 0.7933 - val_loss: 1.3135 - val_accuracy: 0.6712
Epoch 80/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7912 - accuracy: 0.7963 - val_loss: 1.4937 - val_accuracy: 0.6322
Epoch 81/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7896 - accuracy: 0.7972 - val_loss: 1.3718 - val_accuracy: 0.6621
Epoch 82/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7876 - accuracy: 0.7950 - val_loss: 1.2166 - val_accuracy: 0.7044
Epoch 83/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7717 - accuracy: 0.8018 - val_loss: 1.3479 - val_accuracy: 0.6602
Epoch 84/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7686 - accuracy: 0.8061 - val_loss: 1.2684 - val_accuracy: 0.6764
Epoch 85/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7683 - accuracy: 0.8011 - val_loss: 1.3829 - val_accuracy: 0.6530
Epoch 86/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7557 - accuracy: 0.8003 - val_loss: 1.2790 - val_accuracy: 0.6855
Epoch 87/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7420 - accuracy: 0.8055 - val_loss: 1.4198 - val_accuracy: 0.6504
Epoch 88/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7567 - accuracy: 0.8009 - val_loss: 1.3232 - val_accuracy: 0.6712
Epoch 89/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7193 - accuracy: 0.8204 - val_loss: 1.2454 - val_accuracy: 0.6888
Epoch 90/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7098 - accuracy: 0.8186 - val_loss: 1.3432 - val_accuracy: 0.6725
Epoch 91/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7168 - accuracy: 0.8139 - val_loss: 1.3820 - val_accuracy: 0.6530
Epoch 92/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7049 - accuracy: 0.8230 - val_loss: 1.3251 - val_accuracy: 0.6823
Epoch 93/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7035 - accuracy: 0.8262 - val_loss: 1.4510 - val_accuracy: 0.6497
Epoch 94/200
133/133 [==============================] - 1s 6ms/step - loss: 0.7211 - accuracy: 0.8126 - val_loss: 1.3055 - val_accuracy: 0.6836
Epoch 95/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6837 - accuracy: 0.8206 - val_loss: 1.3363 - val_accuracy: 0.6719
Epoch 96/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6915 - accuracy: 0.8190 - val_loss: 1.2766 - val_accuracy: 0.6888
Epoch 97/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6768 - accuracy: 0.8233 - val_loss: 1.2723 - val_accuracy: 0.6914
Epoch 98/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6836 - accuracy: 0.8262 - val_loss: 1.2375 - val_accuracy: 0.6875
Epoch 99/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6708 - accuracy: 0.8222 - val_loss: 1.3011 - val_accuracy: 0.6725
Epoch 100/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6724 - accuracy: 0.8253 - val_loss: 1.3630 - val_accuracy: 0.6660
Epoch 101/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6712 - accuracy: 0.8252 - val_loss: 1.3064 - val_accuracy: 0.6732
Epoch 102/200
133/133 [==============================] - 1s 6ms/step - loss: 0.6595 - accuracy: 0.8319 - val_loss: 1.2169 - val_accuracy: 0.6973
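As a sanity check, the parameter counts reported by `model.summary()` above can be reproduced by hand. A minimal sketch (helper names are ours, not from the notebook; all convolutions are 3×3, and BatchNormalization carries 4 parameters per channel, of which the 2 moving statistics are non-trainable):

```python
# Cross-check of the CNN1 parameter counts from model.summary().
def conv_params(k, c_in, c_out):
    return k * k * c_in * c_out + c_out   # kernel weights + biases

def bn_params(c):
    return 4 * c                          # gamma, beta, moving mean, moving var

def dense_params(n_in, n_out):
    return n_in * n_out + n_out           # weights + biases

total = (conv_params(3, 3, 32)   + bn_params(32)    # 896 + 128
       + conv_params(3, 32, 64)  + bn_params(64)    # 18496 + 256
       + conv_params(3, 64, 128) + bn_params(128)   # 73856 + 512
       + dense_params(512, 1024)                    # 525312
       + dense_params(1024, 20))                    # 20500
non_trainable = 2 * (32 + 64 + 128)                 # BN moving statistics
print(total, non_trainable)  # 639956 448
```

Both figures match the summary: 639,956 total parameters, of which 448 are non-trainable.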
In [ ]:
_, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
accuracies_opt_64["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.226
Accuracy: 67.676%
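The `accuracies_opt_64` dictionary accumulates the test accuracy of each optimized model; once all models have been evaluated, a side-by-side comparison can be printed. A minimal sketch (assuming accuracies are stored as percentages, as printed above):

```python
# Print collected test accuracies, best first; values taken from the
# model_report outputs above.
accuracies_opt_64 = {"SIMPLE_MODEL": 67.725, "CNN1": 67.676}
for name, acc in sorted(accuracies_opt_64.items(), key=lambda kv: -kv[1]):
    print(f"{name:<15} {acc:6.3f}%")
```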
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_7 (Dropout)          (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_8 (Dropout)          (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_9 (Dropout)          (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_9 (Batch (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_9 (ReLU)               (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_10 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dropout_11 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
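Note the output shapes compared with CNN1: here every convolution preserves the spatial size (32→32, 16→16, 8→8), which indicates `padding='same'`, whereas CNN1's convolutions shrink each side by 2 (`padding='valid'`, 3×3 kernels). The shape arithmetic can be sketched in plain Python (our helper names, inferred from the two summaries):

```python
# Spatial output sizes for stride-1 convolutions and 2x2 pooling,
# inferred from the model.summary() tables of CNN1 and CNN2.
def conv_out(n, k=3, padding="valid"):
    return n if padding == "same" else n - k + 1  # 'valid' loses k-1 pixels

def pool_out(n, p=2):
    return n // p

def trace(padding):
    n, sizes = 32, []
    for _ in range(3):                 # three conv+pool blocks
        n = conv_out(n, padding=padding); sizes.append(n)
        n = pool_out(n);                  sizes.append(n)
    return sizes

print(trace("valid"))  # CNN1: [30, 15, 13, 6, 4, 2]
print(trace("same"))   # CNN2: [32, 16, 16, 8, 8, 4]
```

CNN2's fourth convolution then keeps the 4×4 map, so the flatten layer sees 4·4·256 = 4096 features, which is why its dense layer dominates the parameter count (2,097,664 of 2,498,260).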
Epoch 1/200
133/133 [==============================] - 2s 8ms/step - loss: 6.0596 - accuracy: 0.1055 - val_loss: 6.0700 - val_accuracy: 0.0462
Epoch 2/200
133/133 [==============================] - 1s 7ms/step - loss: 5.3558 - accuracy: 0.2280 - val_loss: 6.3054 - val_accuracy: 0.0482
Epoch 3/200
133/133 [==============================] - 1s 7ms/step - loss: 5.0500 - accuracy: 0.2728 - val_loss: 5.7561 - val_accuracy: 0.1061
Epoch 4/200
133/133 [==============================] - 1s 7ms/step - loss: 4.7332 - accuracy: 0.3254 - val_loss: 5.1590 - val_accuracy: 0.1947
Epoch 5/200
133/133 [==============================] - 1s 7ms/step - loss: 4.4968 - accuracy: 0.3551 - val_loss: 4.8440 - val_accuracy: 0.2454
Epoch 6/200
133/133 [==============================] - 1s 7ms/step - loss: 4.2382 - accuracy: 0.4074 - val_loss: 4.5126 - val_accuracy: 0.3171
Epoch 7/200
133/133 [==============================] - 1s 7ms/step - loss: 4.0748 - accuracy: 0.4103 - val_loss: 4.6603 - val_accuracy: 0.2669
Epoch 8/200
133/133 [==============================] - 1s 7ms/step - loss: 3.8724 - accuracy: 0.4370 - val_loss: 4.4318 - val_accuracy: 0.2871
Epoch 9/200
133/133 [==============================] - 1s 7ms/step - loss: 3.7006 - accuracy: 0.4577 - val_loss: 4.7249 - val_accuracy: 0.2467
Epoch 10/200
133/133 [==============================] - 1s 7ms/step - loss: 3.5130 - accuracy: 0.4774 - val_loss: 4.1595 - val_accuracy: 0.3294
Epoch 11/200
133/133 [==============================] - 1s 7ms/step - loss: 3.3870 - accuracy: 0.4879 - val_loss: 4.2865 - val_accuracy: 0.2975
Epoch 12/200
133/133 [==============================] - 1s 7ms/step - loss: 3.1940 - accuracy: 0.5111 - val_loss: 4.0032 - val_accuracy: 0.3457
Epoch 13/200
133/133 [==============================] - 1s 7ms/step - loss: 3.0978 - accuracy: 0.5183 - val_loss: 4.1383 - val_accuracy: 0.3105
Epoch 14/200
133/133 [==============================] - 1s 7ms/step - loss: 2.9787 - accuracy: 0.5389 - val_loss: 3.9492 - val_accuracy: 0.3346
Epoch 15/200
133/133 [==============================] - 1s 7ms/step - loss: 2.8410 - accuracy: 0.5453 - val_loss: 3.6921 - val_accuracy: 0.3789
Epoch 16/200
133/133 [==============================] - 1s 7ms/step - loss: 2.7232 - accuracy: 0.5647 - val_loss: 3.3955 - val_accuracy: 0.4128
Epoch 17/200
133/133 [==============================] - 1s 7ms/step - loss: 2.6071 - accuracy: 0.5716 - val_loss: 3.1726 - val_accuracy: 0.4310
Epoch 18/200
133/133 [==============================] - 1s 7ms/step - loss: 2.5277 - accuracy: 0.5924 - val_loss: 3.5146 - val_accuracy: 0.3678
Epoch 19/200
133/133 [==============================] - 1s 7ms/step - loss: 2.4289 - accuracy: 0.5930 - val_loss: 3.3611 - val_accuracy: 0.4036
Epoch 20/200
133/133 [==============================] - 1s 7ms/step - loss: 2.3678 - accuracy: 0.6038 - val_loss: 3.1244 - val_accuracy: 0.4355
Epoch 21/200
133/133 [==============================] - 1s 7ms/step - loss: 2.2575 - accuracy: 0.6150 - val_loss: 3.3300 - val_accuracy: 0.3854
Epoch 22/200
133/133 [==============================] - 1s 7ms/step - loss: 2.1638 - accuracy: 0.6259 - val_loss: 2.9901 - val_accuracy: 0.4447
Epoch 23/200
133/133 [==============================] - 1s 7ms/step - loss: 2.1086 - accuracy: 0.6388 - val_loss: 2.7598 - val_accuracy: 0.4883
Epoch 24/200
133/133 [==============================] - 1s 7ms/step - loss: 2.0167 - accuracy: 0.6542 - val_loss: 2.7263 - val_accuracy: 0.4811
Epoch 25/200
133/133 [==============================] - 1s 7ms/step - loss: 1.9672 - accuracy: 0.6581 - val_loss: 2.7782 - val_accuracy: 0.4753
Epoch 26/200
133/133 [==============================] - 1s 7ms/step - loss: 1.8781 - accuracy: 0.6725 - val_loss: 2.6552 - val_accuracy: 0.4831
Epoch 27/200
133/133 [==============================] - 1s 7ms/step - loss: 1.8272 - accuracy: 0.6681 - val_loss: 2.5290 - val_accuracy: 0.5026
Epoch 28/200
133/133 [==============================] - 1s 7ms/step - loss: 1.7602 - accuracy: 0.6871 - val_loss: 2.5660 - val_accuracy: 0.5117
Epoch 29/200
133/133 [==============================] - 1s 7ms/step - loss: 1.7161 - accuracy: 0.6908 - val_loss: 2.6833 - val_accuracy: 0.4772
Epoch 30/200
133/133 [==============================] - 1s 7ms/step - loss: 1.6681 - accuracy: 0.6995 - val_loss: 2.7234 - val_accuracy: 0.4661
Epoch 31/200
133/133 [==============================] - 1s 7ms/step - loss: 1.6144 - accuracy: 0.7104 - val_loss: 2.9461 - val_accuracy: 0.4258
Epoch 32/200
133/133 [==============================] - 1s 7ms/step - loss: 1.5425 - accuracy: 0.7223 - val_loss: 2.3017 - val_accuracy: 0.5475
Epoch 33/200
133/133 [==============================] - 1s 7ms/step - loss: 1.5166 - accuracy: 0.7162 - val_loss: 2.2398 - val_accuracy: 0.5540
Epoch 34/200
133/133 [==============================] - 1s 7ms/step - loss: 1.4591 - accuracy: 0.7299 - val_loss: 2.0631 - val_accuracy: 0.5840
Epoch 35/200
133/133 [==============================] - 1s 7ms/step - loss: 1.4104 - accuracy: 0.7419 - val_loss: 2.1394 - val_accuracy: 0.5742
Epoch 36/200
133/133 [==============================] - 1s 7ms/step - loss: 1.3819 - accuracy: 0.7442 - val_loss: 2.1929 - val_accuracy: 0.5534
Epoch 37/200
133/133 [==============================] - 1s 7ms/step - loss: 1.3230 - accuracy: 0.7699 - val_loss: 2.6478 - val_accuracy: 0.4766
Epoch 38/200
133/133 [==============================] - 1s 7ms/step - loss: 1.2833 - accuracy: 0.7639 - val_loss: 2.3734 - val_accuracy: 0.5117
Epoch 39/200
133/133 [==============================] - 1s 7ms/step - loss: 1.2411 - accuracy: 0.7728 - val_loss: 1.9703 - val_accuracy: 0.5938
Epoch 40/200
133/133 [==============================] - 1s 7ms/step - loss: 1.2119 - accuracy: 0.7770 - val_loss: 2.0823 - val_accuracy: 0.5703
Epoch 41/200
133/133 [==============================] - 1s 7ms/step - loss: 1.1828 - accuracy: 0.7786 - val_loss: 2.0745 - val_accuracy: 0.5807
Epoch 42/200
133/133 [==============================] - 1s 7ms/step - loss: 1.1495 - accuracy: 0.7859 - val_loss: 2.1052 - val_accuracy: 0.5605
Epoch 43/200
133/133 [==============================] - 1s 7ms/step - loss: 1.1038 - accuracy: 0.7994 - val_loss: 1.8129 - val_accuracy: 0.6172
Epoch 44/200
133/133 [==============================] - 1s 7ms/step - loss: 1.0689 - accuracy: 0.8063 - val_loss: 2.0445 - val_accuracy: 0.5827
Epoch 45/200
133/133 [==============================] - 1s 7ms/step - loss: 1.0490 - accuracy: 0.8027 - val_loss: 1.9563 - val_accuracy: 0.5931
Epoch 46/200
133/133 [==============================] - 1s 7ms/step - loss: 1.0100 - accuracy: 0.8176 - val_loss: 2.0169 - val_accuracy: 0.5671
Epoch 47/200
133/133 [==============================] - 1s 7ms/step - loss: 0.9867 - accuracy: 0.8183 - val_loss: 1.8934 - val_accuracy: 0.6074
Epoch 48/200
133/133 [==============================] - 1s 7ms/step - loss: 0.9618 - accuracy: 0.8298 - val_loss: 2.0402 - val_accuracy: 0.5820
Epoch 49/200
133/133 [==============================] - 1s 7ms/step - loss: 0.9418 - accuracy: 0.8286 - val_loss: 2.0741 - val_accuracy: 0.5801
Epoch 50/200
133/133 [==============================] - 1s 7ms/step - loss: 0.9089 - accuracy: 0.8355 - val_loss: 1.8453 - val_accuracy: 0.6211
Epoch 51/200
133/133 [==============================] - 1s 7ms/step - loss: 0.8854 - accuracy: 0.8395 - val_loss: 1.7665 - val_accuracy: 0.6172
Epoch 52/200
133/133 [==============================] - 1s 7ms/step - loss: 0.8906 - accuracy: 0.8354 - val_loss: 1.7670 - val_accuracy: 0.6237
Epoch 53/200
133/133 [==============================] - 1s 7ms/step - loss: 0.8573 - accuracy: 0.8451 - val_loss: 1.9158 - val_accuracy: 0.6120
Epoch 54/200
133/133 [==============================] - 1s 7ms/step - loss: 0.8240 - accuracy: 0.8531 - val_loss: 1.8247 - val_accuracy: 0.6165
Epoch 55/200
133/133 [==============================] - 1s 7ms/step - loss: 0.7930 - accuracy: 0.8606 - val_loss: 1.8162 - val_accuracy: 0.6217
Epoch 56/200
133/133 [==============================] - 1s 7ms/step - loss: 0.8077 - accuracy: 0.8504 - val_loss: 1.9507 - val_accuracy: 0.5944
Epoch 57/200
133/133 [==============================] - 1s 7ms/step - loss: 0.7667 - accuracy: 0.8657 - val_loss: 1.6481 - val_accuracy: 0.6491
Epoch 58/200
133/133 [==============================] - 1s 7ms/step - loss: 0.7474 - accuracy: 0.8666 - val_loss: 1.6704 - val_accuracy: 0.6426
Epoch 59/200
133/133 [==============================] - 1s 7ms/step - loss: 0.7396 - accuracy: 0.8649 - val_loss: 1.7608 - val_accuracy: 0.6315
Epoch 60/200
133/133 [==============================] - 1s 7ms/step - loss: 0.7086 - accuracy: 0.8760 - val_loss: 1.8439 - val_accuracy: 0.6139
Epoch 61/200
133/133 [==============================] - 1s 7ms/step - loss: 0.7004 - accuracy: 0.8794 - val_loss: 1.8012 - val_accuracy: 0.6230
Epoch 62/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6991 - accuracy: 0.8742 - val_loss: 1.7178 - val_accuracy: 0.6276
Epoch 63/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6724 - accuracy: 0.8858 - val_loss: 1.9991 - val_accuracy: 0.5918
Epoch 64/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6673 - accuracy: 0.8823 - val_loss: 1.5361 - val_accuracy: 0.6634
Epoch 65/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6457 - accuracy: 0.8859 - val_loss: 1.6602 - val_accuracy: 0.6523
Epoch 66/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6334 - accuracy: 0.8927 - val_loss: 1.7453 - val_accuracy: 0.6302
Epoch 67/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6127 - accuracy: 0.8929 - val_loss: 1.7124 - val_accuracy: 0.6445
Epoch 68/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6150 - accuracy: 0.8928 - val_loss: 1.7759 - val_accuracy: 0.6406
Epoch 69/200
133/133 [==============================] - 1s 7ms/step - loss: 0.6086 - accuracy: 0.8904 - val_loss: 1.7303 - val_accuracy: 0.6439
Epoch 70/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5636 - accuracy: 0.9080 - val_loss: 1.6300 - val_accuracy: 0.6615
Epoch 71/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5623 - accuracy: 0.9101 - val_loss: 1.7380 - val_accuracy: 0.6328
Epoch 72/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5735 - accuracy: 0.9002 - val_loss: 1.6116 - val_accuracy: 0.6654
Epoch 73/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5466 - accuracy: 0.9135 - val_loss: 1.8313 - val_accuracy: 0.6439
Epoch 74/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5513 - accuracy: 0.9009 - val_loss: 1.4843 - val_accuracy: 0.6849
Epoch 75/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5462 - accuracy: 0.9089 - val_loss: 1.6370 - val_accuracy: 0.6615
Epoch 76/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5272 - accuracy: 0.9103 - val_loss: 1.5224 - val_accuracy: 0.6803
Epoch 77/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5211 - accuracy: 0.9123 - val_loss: 1.5692 - val_accuracy: 0.6686
Epoch 78/200
133/133 [==============================] - 1s 7ms/step - loss: 0.5041 - accuracy: 0.9212 - val_loss: 1.5740 - val_accuracy: 0.6673
Epoch 79/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4953 - accuracy: 0.9217 - val_loss: 1.5433 - val_accuracy: 0.6764
Epoch 80/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4936 - accuracy: 0.9224 - val_loss: 1.8186 - val_accuracy: 0.6283
Epoch 81/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4815 - accuracy: 0.9250 - val_loss: 1.5310 - val_accuracy: 0.6764
Epoch 82/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4754 - accuracy: 0.9243 - val_loss: 1.6597 - val_accuracy: 0.6673
Epoch 83/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4609 - accuracy: 0.9256 - val_loss: 1.4760 - val_accuracy: 0.6927
Epoch 84/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4579 - accuracy: 0.9284 - val_loss: 1.5608 - val_accuracy: 0.6790
Epoch 85/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4474 - accuracy: 0.9306 - val_loss: 1.7452 - val_accuracy: 0.6504
Epoch 86/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4584 - accuracy: 0.9268 - val_loss: 1.5842 - val_accuracy: 0.6816
Epoch 87/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4501 - accuracy: 0.9242 - val_loss: 1.6723 - val_accuracy: 0.6595
Epoch 88/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4340 - accuracy: 0.9313 - val_loss: 1.5904 - val_accuracy: 0.6745
Epoch 89/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4269 - accuracy: 0.9327 - val_loss: 1.5486 - val_accuracy: 0.6810
Epoch 90/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4358 - accuracy: 0.9299 - val_loss: 1.4770 - val_accuracy: 0.6862
Epoch 91/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4143 - accuracy: 0.9420 - val_loss: 1.6525 - val_accuracy: 0.6589
Epoch 92/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4308 - accuracy: 0.9316 - val_loss: 1.6807 - val_accuracy: 0.6634
Epoch 93/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4243 - accuracy: 0.9274 - val_loss: 1.5858 - val_accuracy: 0.6777
Epoch 94/200
133/133 [==============================] - 1s 7ms/step - loss: 0.4159 - accuracy: 0.9355 - val_loss: 1.6280 - val_accuracy: 0.6784
Epoch 95/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3915 - accuracy: 0.9446 - val_loss: 1.7714 - val_accuracy: 0.6452
Epoch 96/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3970 - accuracy: 0.9411 - val_loss: 1.5961 - val_accuracy: 0.6771
Epoch 97/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3907 - accuracy: 0.9432 - val_loss: 1.6225 - val_accuracy: 0.6641
Epoch 98/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3928 - accuracy: 0.9400 - val_loss: 1.6378 - val_accuracy: 0.6745
Epoch 99/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3932 - accuracy: 0.9379 - val_loss: 1.3914 - val_accuracy: 0.6895
Epoch 100/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3770 - accuracy: 0.9427 - val_loss: 1.5542 - val_accuracy: 0.6921
Epoch 101/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3656 - accuracy: 0.9464 - val_loss: 1.5084 - val_accuracy: 0.6901
Epoch 102/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3601 - accuracy: 0.9501 - val_loss: 1.5894 - val_accuracy: 0.6712
Epoch 103/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3837 - accuracy: 0.9382 - val_loss: 1.5500 - val_accuracy: 0.6882
Epoch 104/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3591 - accuracy: 0.9445 - val_loss: 1.7585 - val_accuracy: 0.6367
Epoch 105/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3730 - accuracy: 0.9420 - val_loss: 1.5572 - val_accuracy: 0.6647
Epoch 106/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3547 - accuracy: 0.9478 - val_loss: 1.4985 - val_accuracy: 0.6979
Epoch 107/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3610 - accuracy: 0.9475 - val_loss: 1.5824 - val_accuracy: 0.6751
Epoch 108/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3554 - accuracy: 0.9487 - val_loss: 1.5025 - val_accuracy: 0.6960
Epoch 109/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3548 - accuracy: 0.9478 - val_loss: 1.8067 - val_accuracy: 0.6439
Epoch 110/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3515 - accuracy: 0.9453 - val_loss: 1.7734 - val_accuracy: 0.6660
Epoch 111/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3453 - accuracy: 0.9508 - val_loss: 1.9001 - val_accuracy: 0.6387
Epoch 112/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3570 - accuracy: 0.9449 - val_loss: 1.4452 - val_accuracy: 0.7064
Epoch 113/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3265 - accuracy: 0.9560 - val_loss: 1.5173 - val_accuracy: 0.6901
Epoch 114/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3266 - accuracy: 0.9557 - val_loss: 1.5249 - val_accuracy: 0.6882
Epoch 115/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3341 - accuracy: 0.9440 - val_loss: 1.6152 - val_accuracy: 0.6842
Epoch 116/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3254 - accuracy: 0.9529 - val_loss: 1.6222 - val_accuracy: 0.6868
Epoch 117/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3250 - accuracy: 0.9559 - val_loss: 1.4350 - val_accuracy: 0.7116
Epoch 118/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3248 - accuracy: 0.9537 - val_loss: 1.5064 - val_accuracy: 0.7038
Epoch 119/200
133/133 [==============================] - 1s 7ms/step - loss: 0.3230 - accuracy: 0.9529 - val_loss: 1.7211 - val_accuracy: 0.6777
In [ ]:
_, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
accuracies_opt_64["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.372
Accuracy: 70.264%
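The `callback` object passed to `train_model` is defined in an earlier section of the notebook and not shown here. Judging from the logs (each run stops about 20 epochs after its best validation loss, and the reported test loss tracks the best epoch rather than the last one), it is plausibly an early-stopping callback along the following lines; the exact parameters are an assumption:

```python
import tensorflow as tf

# Assumed reconstruction of the early-stopping callback used above:
# training halts once val_loss has not improved for `patience` epochs,
# and the weights from the best epoch are restored before evaluation.
callback = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=20,                 # inferred from the logs; not shown in this chunk
    restore_best_weights=True,
)
```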

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs = 200, callbacks = [callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58892288/58889256 [==============================] - 0s 0us/step
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_12 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 512)               0         
_________________________________________________________________
dense_6 (Dense)              (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
133/133 [==============================] - 4s 25ms/step - loss: 2.6305 - accuracy: 0.2193 - val_loss: 1.3691 - val_accuracy: 0.5918
Epoch 2/200
133/133 [==============================] - 3s 24ms/step - loss: 1.3613 - accuracy: 0.5996 - val_loss: 1.0562 - val_accuracy: 0.6901
Epoch 3/200
133/133 [==============================] - 3s 24ms/step - loss: 0.9347 - accuracy: 0.7202 - val_loss: 0.9155 - val_accuracy: 0.7311
Epoch 4/200
133/133 [==============================] - 3s 24ms/step - loss: 0.6759 - accuracy: 0.8024 - val_loss: 1.0745 - val_accuracy: 0.7038
Epoch 5/200
133/133 [==============================] - 3s 24ms/step - loss: 0.5253 - accuracy: 0.8446 - val_loss: 1.0019 - val_accuracy: 0.7298
Epoch 6/200
133/133 [==============================] - 3s 24ms/step - loss: 0.3574 - accuracy: 0.8927 - val_loss: 0.9371 - val_accuracy: 0.7702
Epoch 7/200
133/133 [==============================] - 3s 24ms/step - loss: 0.2632 - accuracy: 0.9193 - val_loss: 1.0333 - val_accuracy: 0.7402
Epoch 8/200
133/133 [==============================] - 3s 24ms/step - loss: 0.2283 - accuracy: 0.9337 - val_loss: 1.0346 - val_accuracy: 0.7578
Epoch 9/200
133/133 [==============================] - 3s 24ms/step - loss: 0.1500 - accuracy: 0.9522 - val_loss: 1.0389 - val_accuracy: 0.7604
Epoch 10/200
133/133 [==============================] - 3s 24ms/step - loss: 0.1376 - accuracy: 0.9602 - val_loss: 1.1320 - val_accuracy: 0.7624
Epoch 11/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0744 - accuracy: 0.9751 - val_loss: 1.3180 - val_accuracy: 0.7214
Epoch 12/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0997 - accuracy: 0.9664 - val_loss: 1.1590 - val_accuracy: 0.7441
Epoch 13/200
133/133 [==============================] - 3s 24ms/step - loss: 0.1072 - accuracy: 0.9714 - val_loss: 1.1267 - val_accuracy: 0.7689
Epoch 14/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0475 - accuracy: 0.9861 - val_loss: 1.3372 - val_accuracy: 0.7591
Epoch 15/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0676 - accuracy: 0.9813 - val_loss: 1.1374 - val_accuracy: 0.7572
Epoch 16/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0562 - accuracy: 0.9845 - val_loss: 1.2195 - val_accuracy: 0.7585
Epoch 17/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0395 - accuracy: 0.9867 - val_loss: 1.3348 - val_accuracy: 0.7474
Epoch 18/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0735 - accuracy: 0.9783 - val_loss: 1.2009 - val_accuracy: 0.7708
Epoch 19/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0595 - accuracy: 0.9836 - val_loss: 1.1491 - val_accuracy: 0.7663
Epoch 20/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0364 - accuracy: 0.9896 - val_loss: 1.1014 - val_accuracy: 0.7715
Epoch 21/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0486 - accuracy: 0.9860 - val_loss: 1.3981 - val_accuracy: 0.7487
Epoch 22/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0637 - accuracy: 0.9806 - val_loss: 1.1921 - val_accuracy: 0.7780
Epoch 23/200
133/133 [==============================] - 3s 24ms/step - loss: 0.0302 - accuracy: 0.9927 - val_loss: 1.1970 - val_accuracy: 0.7656
In [ ]:
_, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
accuracies_opt_64["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.885
Accuracy: 74.170%
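The `init_VGG16_model_optimized` helper is defined earlier in the notebook. Based on the summary printed above (VGG16 base → Dropout → GlobalAveragePooling2D → Dense(20), 14,724,948 parameters, all trainable), a minimal sketch of such a builder might look as follows. The dropout rate, optimizer, and loss are assumptions, and the extra `weights` parameter exists here only so the sketch can be built without downloading pretrained weights:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def init_VGG16_model_optimized_sketch(train_base=True, weights="imagenet"):
    # VGG16 convolutional base without its classifier head; for 32x32
    # CIFAR inputs the base outputs a (1, 1, 512) feature map.
    base = tf.keras.applications.VGG16(
        include_top=False, weights=weights, input_shape=(32, 32, 3)
    )
    base.trainable = train_base   # True -> fine-tune all layers ("VGG_ALL")

    model = models.Sequential([
        base,
        layers.Dropout(0.2),                      # assumed rate
        layers.GlobalAveragePooling2D(),
        layers.Dense(20, activation="softmax"),   # 20-class subset
    ])
    model.compile(optimizer="adam",               # assumed optimizer/loss
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```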
MobileNetV2
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9412608/9406464 [==============================] - 0s 0us/step
Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_13 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1280)              0         
_________________________________________________________________
dense_7 (Dense)              (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
133/133 [==============================] - 50s 349ms/step - loss: 1.8738 - accuracy: 0.4690 - val_loss: 2.5203 - val_accuracy: 0.3874
Epoch 2/200
133/133 [==============================] - 46s 345ms/step - loss: 0.3524 - accuracy: 0.8976 - val_loss: 2.0415 - val_accuracy: 0.4824
Epoch 3/200
133/133 [==============================] - 46s 344ms/step - loss: 0.1385 - accuracy: 0.9680 - val_loss: 2.1789 - val_accuracy: 0.4701
Epoch 4/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0643 - accuracy: 0.9891 - val_loss: 2.3982 - val_accuracy: 0.4408
Epoch 5/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0323 - accuracy: 0.9973 - val_loss: 2.9026 - val_accuracy: 0.3932
Epoch 6/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0202 - accuracy: 0.9983 - val_loss: 2.7992 - val_accuracy: 0.4049
Epoch 7/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0124 - accuracy: 0.9995 - val_loss: 3.1208 - val_accuracy: 0.3516
Epoch 8/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0093 - accuracy: 0.9996 - val_loss: 3.0823 - val_accuracy: 0.3652
Epoch 9/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0077 - accuracy: 0.9996 - val_loss: 2.8822 - val_accuracy: 0.3958
Epoch 10/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0072 - accuracy: 0.9989 - val_loss: 2.8794 - val_accuracy: 0.3984
Epoch 11/200
133/133 [==============================] - 45s 337ms/step - loss: 0.0060 - accuracy: 0.9992 - val_loss: 2.6646 - val_accuracy: 0.4160
Epoch 12/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0031 - accuracy: 1.0000 - val_loss: 2.6702 - val_accuracy: 0.4238
Epoch 13/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0026 - accuracy: 1.0000 - val_loss: 2.7428 - val_accuracy: 0.4212
Epoch 14/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0045 - accuracy: 0.9995 - val_loss: 2.3714 - val_accuracy: 0.4759
Epoch 15/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0037 - accuracy: 1.0000 - val_loss: 1.8875 - val_accuracy: 0.5638
Epoch 16/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0021 - accuracy: 1.0000 - val_loss: 1.9945 - val_accuracy: 0.5625
Epoch 17/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0024 - accuracy: 0.9998 - val_loss: 2.6200 - val_accuracy: 0.4557
Epoch 18/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0063 - accuracy: 0.9982 - val_loss: 1.1391 - val_accuracy: 0.7155
Epoch 19/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0136 - accuracy: 0.9960 - val_loss: 1.0725 - val_accuracy: 0.7357
Epoch 20/200
133/133 [==============================] - 46s 345ms/step - loss: 0.0700 - accuracy: 0.9775 - val_loss: 1.0481 - val_accuracy: 0.7630
Epoch 21/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0351 - accuracy: 0.9897 - val_loss: 1.1646 - val_accuracy: 0.7526
Epoch 22/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0226 - accuracy: 0.9922 - val_loss: 1.0498 - val_accuracy: 0.7630
Epoch 23/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0093 - accuracy: 0.9979 - val_loss: 0.7100 - val_accuracy: 0.8333
Epoch 24/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0058 - accuracy: 0.9987 - val_loss: 0.6884 - val_accuracy: 0.8444
Epoch 25/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0069 - accuracy: 0.9979 - val_loss: 0.6417 - val_accuracy: 0.8464
Epoch 26/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0052 - accuracy: 0.9988 - val_loss: 0.6898 - val_accuracy: 0.8392
Epoch 27/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0020 - accuracy: 1.0000 - val_loss: 0.6194 - val_accuracy: 0.8691
Epoch 28/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0026 - accuracy: 0.9992 - val_loss: 0.6518 - val_accuracy: 0.8620
Epoch 29/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0036 - accuracy: 0.9985 - val_loss: 0.6968 - val_accuracy: 0.8464
Epoch 30/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0044 - accuracy: 0.9987 - val_loss: 0.6242 - val_accuracy: 0.8659
Epoch 31/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0036 - accuracy: 0.9995 - val_loss: 0.8367 - val_accuracy: 0.8320
Epoch 32/200
133/133 [==============================] - 46s 345ms/step - loss: 0.0106 - accuracy: 0.9964 - val_loss: 1.0474 - val_accuracy: 0.7936
Epoch 33/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0133 - accuracy: 0.9954 - val_loss: 1.0885 - val_accuracy: 0.7871
Epoch 34/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0162 - accuracy: 0.9948 - val_loss: 1.3752 - val_accuracy: 0.7598
Epoch 35/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0135 - accuracy: 0.9953 - val_loss: 1.1408 - val_accuracy: 0.7826
Epoch 36/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0089 - accuracy: 0.9969 - val_loss: 1.0514 - val_accuracy: 0.7910
Epoch 37/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0089 - accuracy: 0.9973 - val_loss: 0.9533 - val_accuracy: 0.7969
Epoch 38/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0125 - accuracy: 0.9958 - val_loss: 0.9452 - val_accuracy: 0.8034
Epoch 39/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0082 - accuracy: 0.9973 - val_loss: 0.8392 - val_accuracy: 0.8353
Epoch 40/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0030 - accuracy: 0.9993 - val_loss: 0.8304 - val_accuracy: 0.8359
Epoch 41/200
133/133 [==============================] - 45s 342ms/step - loss: 0.0027 - accuracy: 0.9992 - val_loss: 0.6937 - val_accuracy: 0.8607
Epoch 42/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0029 - accuracy: 0.9991 - val_loss: 0.7257 - val_accuracy: 0.8717
Epoch 43/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0062 - accuracy: 0.9978 - val_loss: 0.5968 - val_accuracy: 0.8783
Epoch 44/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0107 - accuracy: 0.9966 - val_loss: 0.7269 - val_accuracy: 0.8509
Epoch 45/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0090 - accuracy: 0.9967 - val_loss: 0.7512 - val_accuracy: 0.8483
Epoch 46/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0108 - accuracy: 0.9973 - val_loss: 1.0550 - val_accuracy: 0.8118
Epoch 47/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0123 - accuracy: 0.9957 - val_loss: 0.8554 - val_accuracy: 0.8288
Epoch 48/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0076 - accuracy: 0.9978 - val_loss: 0.7945 - val_accuracy: 0.8340
Epoch 49/200
133/133 [==============================] - 46s 342ms/step - loss: 0.0057 - accuracy: 0.9988 - val_loss: 0.7471 - val_accuracy: 0.8503
Epoch 50/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0045 - accuracy: 0.9984 - val_loss: 0.7434 - val_accuracy: 0.8600
Epoch 51/200
133/133 [==============================] - 45s 338ms/step - loss: 0.0084 - accuracy: 0.9977 - val_loss: 0.7439 - val_accuracy: 0.8542
Epoch 52/200
133/133 [==============================] - 46s 344ms/step - loss: 0.0104 - accuracy: 0.9960 - val_loss: 1.1336 - val_accuracy: 0.7917
Epoch 53/200
133/133 [==============================] - 46s 346ms/step - loss: 0.0098 - accuracy: 0.9975 - val_loss: 0.9685 - val_accuracy: 0.8242
Epoch 54/200
133/133 [==============================] - 45s 337ms/step - loss: 0.0065 - accuracy: 0.9988 - val_loss: 0.7379 - val_accuracy: 0.8470
Epoch 55/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0025 - accuracy: 0.9994 - val_loss: 0.6751 - val_accuracy: 0.8730
Epoch 56/200
133/133 [==============================] - 46s 345ms/step - loss: 0.0053 - accuracy: 0.9983 - val_loss: 0.7247 - val_accuracy: 0.8535
Epoch 57/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0029 - accuracy: 0.9995 - val_loss: 0.6476 - val_accuracy: 0.8757
Epoch 58/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0030 - accuracy: 0.9992 - val_loss: 0.6103 - val_accuracy: 0.8789
Epoch 59/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0032 - accuracy: 0.9992 - val_loss: 0.6441 - val_accuracy: 0.8646
Epoch 60/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0079 - accuracy: 0.9978 - val_loss: 0.6691 - val_accuracy: 0.8672
Epoch 61/200
133/133 [==============================] - 46s 343ms/step - loss: 0.0104 - accuracy: 0.9974 - val_loss: 0.7687 - val_accuracy: 0.8470
Epoch 62/200
133/133 [==============================] - 45s 341ms/step - loss: 0.0104 - accuracy: 0.9969 - val_loss: 0.9895 - val_accuracy: 0.8236
Epoch 63/200
133/133 [==============================] - 45s 340ms/step - loss: 0.0087 - accuracy: 0.9973 - val_loss: 0.8581 - val_accuracy: 0.8392
In [ ]:
_, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
accuracies_opt_64["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.627
Accuracy: 88.086%
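Unlike the other models, this run is fed resized datasets (`train_ds_res`, `validation_ds_res`, `test_ds_res`): the summary's (7, 7, 1280) feature map implies 224x224 inputs, MobileNetV2's native resolution. A plausible sketch of the resizing step is shown below; the function name is hypothetical, and whether the original pipeline also applied `preprocess_input` is an assumption:

```python
import tensorflow as tf

IMG_SIZE = 224  # MobileNetV2's native resolution (summary shows 7x7x1280)

def resize_and_preprocess(image, label):
    # Upscale a 32x32 CIFAR image to the resolution MobileNetV2 expects,
    # then apply the model's own preprocessing (maps pixels to [-1, 1]).
    image = tf.image.resize(image, (IMG_SIZE, IMG_SIZE))
    image = tf.keras.applications.mobilenet_v2.preprocess_input(image)
    return image, label

# Hypothetical derivation of the resized pipelines (names from the cell above):
# train_ds_res = train_ds.map(resize_and_preprocess)
```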
DenseNet121
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_14 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_2 ( (None, 1024)              0         
_________________________________________________________________
dense_8 (Dense)              (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
133/133 [==============================] - 16s 51ms/step - loss: 3.6407 - accuracy: 0.1197 - val_loss: 2.1637 - val_accuracy: 0.3854
Epoch 2/200
133/133 [==============================] - 5s 40ms/step - loss: 1.8392 - accuracy: 0.4525 - val_loss: 1.6039 - val_accuracy: 0.5866
Epoch 3/200
133/133 [==============================] - 5s 40ms/step - loss: 1.2468 - accuracy: 0.6299 - val_loss: 1.1309 - val_accuracy: 0.6934
Epoch 4/200
133/133 [==============================] - 5s 40ms/step - loss: 0.9088 - accuracy: 0.7305 - val_loss: 0.9801 - val_accuracy: 0.7064
Epoch 5/200
133/133 [==============================] - 5s 40ms/step - loss: 0.6380 - accuracy: 0.8031 - val_loss: 0.9326 - val_accuracy: 0.7233
Epoch 6/200
133/133 [==============================] - 5s 40ms/step - loss: 0.4713 - accuracy: 0.8580 - val_loss: 0.9259 - val_accuracy: 0.7389
Epoch 7/200
133/133 [==============================] - 5s 40ms/step - loss: 0.3653 - accuracy: 0.8860 - val_loss: 0.9088 - val_accuracy: 0.7467
Epoch 8/200
133/133 [==============================] - 5s 39ms/step - loss: 0.2583 - accuracy: 0.9279 - val_loss: 0.9351 - val_accuracy: 0.7513
Epoch 9/200
133/133 [==============================] - 5s 39ms/step - loss: 0.1889 - accuracy: 0.9433 - val_loss: 0.8969 - val_accuracy: 0.7630
Epoch 10/200
133/133 [==============================] - 5s 39ms/step - loss: 0.1497 - accuracy: 0.9598 - val_loss: 0.9558 - val_accuracy: 0.7461
Epoch 11/200
133/133 [==============================] - 5s 39ms/step - loss: 0.1180 - accuracy: 0.9675 - val_loss: 0.9890 - val_accuracy: 0.7552
Epoch 12/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0999 - accuracy: 0.9726 - val_loss: 0.9758 - val_accuracy: 0.7598
Epoch 13/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0756 - accuracy: 0.9804 - val_loss: 1.0388 - val_accuracy: 0.7669
Epoch 14/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0774 - accuracy: 0.9781 - val_loss: 1.0377 - val_accuracy: 0.7650
Epoch 15/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0632 - accuracy: 0.9823 - val_loss: 1.0899 - val_accuracy: 0.7520
Epoch 16/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0601 - accuracy: 0.9840 - val_loss: 1.0455 - val_accuracy: 0.7695
Epoch 17/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0494 - accuracy: 0.9881 - val_loss: 1.0899 - val_accuracy: 0.7559
Epoch 18/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0672 - accuracy: 0.9818 - val_loss: 1.0794 - val_accuracy: 0.7507
Epoch 19/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0458 - accuracy: 0.9887 - val_loss: 1.1191 - val_accuracy: 0.7507
Epoch 20/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0528 - accuracy: 0.9847 - val_loss: 1.1293 - val_accuracy: 0.7533
Epoch 21/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0643 - accuracy: 0.9809 - val_loss: 1.0854 - val_accuracy: 0.7533
Epoch 22/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0426 - accuracy: 0.9880 - val_loss: 1.1349 - val_accuracy: 0.7441
Epoch 23/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0457 - accuracy: 0.9864 - val_loss: 1.1103 - val_accuracy: 0.7682
Epoch 24/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0486 - accuracy: 0.9855 - val_loss: 1.0899 - val_accuracy: 0.7663
Epoch 25/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0463 - accuracy: 0.9857 - val_loss: 1.1397 - val_accuracy: 0.7611
Epoch 26/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0361 - accuracy: 0.9906 - val_loss: 1.1589 - val_accuracy: 0.7572
Epoch 27/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0417 - accuracy: 0.9880 - val_loss: 1.2370 - val_accuracy: 0.7611
Epoch 28/200
133/133 [==============================] - 5s 39ms/step - loss: 0.0330 - accuracy: 0.9929 - val_loss: 1.1778 - val_accuracy: 0.7637
Epoch 29/200
133/133 [==============================] - 5s 40ms/step - loss: 0.0377 - accuracy: 0.9893 - val_loss: 1.1689 - val_accuracy: 0.7546
In [ ]:
_, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
accuracies_opt_64["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.917
Accuracy: 75.977%

Batch size = 128

In [ ]:
BATCH_SIZE = 128

# Build a shuffled, infinitely repeating, batched and prefetched tf.data pipeline.
def _input_fn(x, y, batch_size):
  ds = tf.data.Dataset.from_tensor_slices((x, y))
  ds = ds.shuffle(buffer_size=data_size)
  ds = ds.repeat()                        # repeat indefinitely; epoch length is set via steps_per_epoch
  ds = ds.batch(batch_size)
  ds = ds.prefetch(buffer_size=AUTOTUNE)  # overlap input preparation with training
  return ds

train_ds = _input_fn(x_train, y_train, BATCH_SIZE)       # PrefetchDataset object
validation_ds = _input_fn(x_val, y_val, BATCH_SIZE)      # PrefetchDataset object
test_ds = _input_fn(x_test, y_test, BATCH_SIZE)          # PrefetchDataset object

# Resized versions for the transfer-learning models
train_ds_res = train_ds.map(resize_transform)
validation_ds_res = validation_ds.map(resize_transform)
test_ds_res = test_ds.map(resize_transform)

# Train a model on the repeating dataset; step counts cover each split once per epoch.
def train_model(model, train_dataset = train_ds, validation_dataset = validation_ds, epochs = 100, callbacks = None,
                steps_per_epoch = int(np.ceil(x_train.shape[0]/BATCH_SIZE)),
                validation_steps = int(np.ceil(x_val.shape[0]/BATCH_SIZE))):
  history = model.fit(train_dataset, epochs=epochs, steps_per_epoch=steps_per_epoch,
                      validation_data=validation_dataset, validation_steps=validation_steps,
                      callbacks=callbacks)
  return(history)

# Plot the learning curves and evaluate the model on the test set.
def model_report(model, history, evaluation_dataset = test_ds, evaluation_steps = int(np.ceil(x_test.shape[0]/BATCH_SIZE))):
  plt = summarize_diagnostics(history)
  plt.show()
  return model_evaluation(model, evaluation_dataset, evaluation_steps)
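Because the dataset repeats indefinitely, Keras cannot infer an epoch length on its own, so `steps_per_epoch` and `validation_steps` are derived from the split sizes with a ceiling division. A minimal sketch of that calculation (the split sizes of 8500/1500 are a hypothetical example, not values taken from the notebook):

```python
import math

def steps_for(num_examples: int, batch_size: int) -> int:
    # One epoch must cover every example once; the final,
    # possibly partial, batch still counts as a full step.
    return math.ceil(num_examples / batch_size)

# Hypothetical split sizes for a 20-class subset:
print(steps_for(8500, 128))  # training steps per epoch at batch size 128
print(steps_for(8500, 64))   # the same split at batch size 64
print(steps_for(1500, 128))  # validation steps per epoch
```

Halving the batch size roughly doubles the number of steps per epoch, which is why the logs show 67 steps at batch size 128 versus 133 at batch size 64 for the same training split.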

"From scratch" networks

In [ ]:
accuracies_opt_128 = {}
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_6"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_10 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_10 (Batc (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_10 (ReLU)              (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_7 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_15 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_11 (Batc (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_11 (ReLU)              (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_8 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_16 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_12 (Batc (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_12 (ReLU)              (None, 4, 4, 64)          0         
_________________________________________________________________
flatten_3 (Flatten)          (None, 1024)              0         
_________________________________________________________________
dropout_17 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_9 (Dense)              (None, 64)                65600     
_________________________________________________________________
dense_10 (Dense)             (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
67/67 [==============================] - 1s 10ms/step - loss: 4.3767 - accuracy: 0.0636 - val_loss: 4.1052 - val_accuracy: 0.0618
Epoch 2/200
67/67 [==============================] - 1s 8ms/step - loss: 3.9787 - accuracy: 0.1261 - val_loss: 4.1613 - val_accuracy: 0.0586
Epoch 3/200
67/67 [==============================] - 1s 8ms/step - loss: 3.8172 - accuracy: 0.1709 - val_loss: 4.2142 - val_accuracy: 0.0710
Epoch 4/200
67/67 [==============================] - 1s 8ms/step - loss: 3.6714 - accuracy: 0.2056 - val_loss: 4.2429 - val_accuracy: 0.0866
Epoch 5/200
67/67 [==============================] - 1s 8ms/step - loss: 3.5419 - accuracy: 0.2215 - val_loss: 4.1879 - val_accuracy: 0.1022
Epoch 6/200
67/67 [==============================] - 1s 8ms/step - loss: 3.4290 - accuracy: 0.2581 - val_loss: 4.0531 - val_accuracy: 0.1204
Epoch 7/200
67/67 [==============================] - 1s 8ms/step - loss: 3.3393 - accuracy: 0.2781 - val_loss: 3.8107 - val_accuracy: 0.1549
Epoch 8/200
67/67 [==============================] - 1s 8ms/step - loss: 3.2688 - accuracy: 0.2870 - val_loss: 3.6079 - val_accuracy: 0.1921
Epoch 9/200
67/67 [==============================] - 1s 8ms/step - loss: 3.1707 - accuracy: 0.3156 - val_loss: 3.3946 - val_accuracy: 0.2435
Epoch 10/200
67/67 [==============================] - 1s 8ms/step - loss: 3.0749 - accuracy: 0.3300 - val_loss: 3.1782 - val_accuracy: 0.2826
Epoch 11/200
67/67 [==============================] - 1s 8ms/step - loss: 2.9719 - accuracy: 0.3565 - val_loss: 3.0465 - val_accuracy: 0.3262
Epoch 12/200
67/67 [==============================] - 1s 8ms/step - loss: 2.9081 - accuracy: 0.3691 - val_loss: 3.0056 - val_accuracy: 0.3288
Epoch 13/200
67/67 [==============================] - 1s 8ms/step - loss: 2.8504 - accuracy: 0.3733 - val_loss: 2.9356 - val_accuracy: 0.3542
Epoch 14/200
67/67 [==============================] - 1s 8ms/step - loss: 2.7794 - accuracy: 0.3935 - val_loss: 2.8191 - val_accuracy: 0.3789
Epoch 15/200
67/67 [==============================] - 1s 8ms/step - loss: 2.7243 - accuracy: 0.4012 - val_loss: 2.8661 - val_accuracy: 0.3639
Epoch 16/200
67/67 [==============================] - 1s 8ms/step - loss: 2.6562 - accuracy: 0.4112 - val_loss: 2.7590 - val_accuracy: 0.3757
Epoch 17/200
67/67 [==============================] - 1s 8ms/step - loss: 2.6050 - accuracy: 0.4277 - val_loss: 2.7306 - val_accuracy: 0.3822
Epoch 18/200
67/67 [==============================] - 1s 8ms/step - loss: 2.5638 - accuracy: 0.4346 - val_loss: 2.6409 - val_accuracy: 0.4023
Epoch 19/200
67/67 [==============================] - 1s 8ms/step - loss: 2.5104 - accuracy: 0.4446 - val_loss: 2.6981 - val_accuracy: 0.3913
Epoch 20/200
67/67 [==============================] - 1s 8ms/step - loss: 2.4461 - accuracy: 0.4627 - val_loss: 2.7450 - val_accuracy: 0.3757
Epoch 21/200
67/67 [==============================] - 1s 8ms/step - loss: 2.3653 - accuracy: 0.4800 - val_loss: 2.5780 - val_accuracy: 0.4049
Epoch 22/200
67/67 [==============================] - 1s 8ms/step - loss: 2.3606 - accuracy: 0.4696 - val_loss: 2.5993 - val_accuracy: 0.4095
Epoch 23/200
67/67 [==============================] - 1s 9ms/step - loss: 2.3256 - accuracy: 0.4770 - val_loss: 2.7853 - val_accuracy: 0.3672
Epoch 24/200
67/67 [==============================] - 1s 8ms/step - loss: 2.2562 - accuracy: 0.4991 - val_loss: 2.5049 - val_accuracy: 0.4290
Epoch 25/200
67/67 [==============================] - 1s 8ms/step - loss: 2.2375 - accuracy: 0.4932 - val_loss: 2.4714 - val_accuracy: 0.4355
Epoch 26/200
67/67 [==============================] - 1s 8ms/step - loss: 2.1750 - accuracy: 0.5028 - val_loss: 2.3436 - val_accuracy: 0.4616
Epoch 27/200
67/67 [==============================] - 1s 8ms/step - loss: 2.1560 - accuracy: 0.5112 - val_loss: 2.4647 - val_accuracy: 0.4388
Epoch 28/200
67/67 [==============================] - 1s 8ms/step - loss: 2.1045 - accuracy: 0.5177 - val_loss: 2.4122 - val_accuracy: 0.4447
Epoch 29/200
67/67 [==============================] - 1s 8ms/step - loss: 2.0927 - accuracy: 0.5141 - val_loss: 2.5421 - val_accuracy: 0.4199
Epoch 30/200
67/67 [==============================] - 1s 8ms/step - loss: 2.0615 - accuracy: 0.5256 - val_loss: 2.3616 - val_accuracy: 0.4518
Epoch 31/200
67/67 [==============================] - 1s 8ms/step - loss: 2.0273 - accuracy: 0.5301 - val_loss: 2.2039 - val_accuracy: 0.4863
Epoch 32/200
67/67 [==============================] - 1s 8ms/step - loss: 1.9841 - accuracy: 0.5470 - val_loss: 2.2466 - val_accuracy: 0.4792
Epoch 33/200
67/67 [==============================] - 1s 8ms/step - loss: 1.9607 - accuracy: 0.5429 - val_loss: 2.2529 - val_accuracy: 0.4701
Epoch 34/200
67/67 [==============================] - 1s 8ms/step - loss: 1.9481 - accuracy: 0.5599 - val_loss: 2.5165 - val_accuracy: 0.4290
Epoch 35/200
67/67 [==============================] - 1s 8ms/step - loss: 1.9197 - accuracy: 0.5536 - val_loss: 2.4805 - val_accuracy: 0.4238
Epoch 36/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8956 - accuracy: 0.5599 - val_loss: 2.2578 - val_accuracy: 0.4668
Epoch 37/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8686 - accuracy: 0.5660 - val_loss: 2.2378 - val_accuracy: 0.4824
Epoch 38/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8223 - accuracy: 0.5734 - val_loss: 2.2305 - val_accuracy: 0.4759
Epoch 39/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8075 - accuracy: 0.5752 - val_loss: 2.1716 - val_accuracy: 0.4941
Epoch 40/200
67/67 [==============================] - 1s 8ms/step - loss: 1.7931 - accuracy: 0.5758 - val_loss: 2.1288 - val_accuracy: 0.4909
Epoch 41/200
67/67 [==============================] - 1s 8ms/step - loss: 1.7681 - accuracy: 0.5806 - val_loss: 2.2999 - val_accuracy: 0.4609
Epoch 42/200
67/67 [==============================] - 1s 8ms/step - loss: 1.7317 - accuracy: 0.5878 - val_loss: 2.0574 - val_accuracy: 0.5065
Epoch 43/200
67/67 [==============================] - 1s 8ms/step - loss: 1.7246 - accuracy: 0.5881 - val_loss: 2.0827 - val_accuracy: 0.5137
Epoch 44/200
67/67 [==============================] - 1s 8ms/step - loss: 1.7116 - accuracy: 0.5925 - val_loss: 2.1137 - val_accuracy: 0.4948
Epoch 45/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6816 - accuracy: 0.6044 - val_loss: 2.1003 - val_accuracy: 0.4876
Epoch 46/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6570 - accuracy: 0.6089 - val_loss: 2.0609 - val_accuracy: 0.5111
Epoch 47/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6524 - accuracy: 0.6055 - val_loss: 1.8986 - val_accuracy: 0.5430
Epoch 48/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6538 - accuracy: 0.6006 - val_loss: 2.1373 - val_accuracy: 0.4889
Epoch 49/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6007 - accuracy: 0.6152 - val_loss: 1.9526 - val_accuracy: 0.5176
Epoch 50/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5894 - accuracy: 0.6185 - val_loss: 2.0831 - val_accuracy: 0.4902
Epoch 51/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5751 - accuracy: 0.6272 - val_loss: 1.9681 - val_accuracy: 0.5124
Epoch 52/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5358 - accuracy: 0.6202 - val_loss: 2.0638 - val_accuracy: 0.5098
Epoch 53/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5645 - accuracy: 0.6222 - val_loss: 1.9988 - val_accuracy: 0.5124
Epoch 54/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4951 - accuracy: 0.6470 - val_loss: 1.9983 - val_accuracy: 0.5098
Epoch 55/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5134 - accuracy: 0.6259 - val_loss: 1.9360 - val_accuracy: 0.5293
Epoch 56/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4943 - accuracy: 0.6378 - val_loss: 1.9892 - val_accuracy: 0.5052
Epoch 57/200
67/67 [==============================] - 1s 9ms/step - loss: 1.4814 - accuracy: 0.6432 - val_loss: 2.1406 - val_accuracy: 0.4759
Epoch 58/200
67/67 [==============================] - 1s 9ms/step - loss: 1.4731 - accuracy: 0.6424 - val_loss: 1.8854 - val_accuracy: 0.5293
Epoch 59/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4624 - accuracy: 0.6459 - val_loss: 1.9238 - val_accuracy: 0.5260
Epoch 60/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4388 - accuracy: 0.6516 - val_loss: 1.9774 - val_accuracy: 0.5046
Epoch 61/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4137 - accuracy: 0.6546 - val_loss: 1.9083 - val_accuracy: 0.5449
Epoch 62/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3984 - accuracy: 0.6569 - val_loss: 1.8173 - val_accuracy: 0.5495
Epoch 63/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3775 - accuracy: 0.6592 - val_loss: 1.7961 - val_accuracy: 0.5501
Epoch 64/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3816 - accuracy: 0.6583 - val_loss: 1.9833 - val_accuracy: 0.5124
Epoch 65/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3746 - accuracy: 0.6595 - val_loss: 1.8478 - val_accuracy: 0.5495
Epoch 66/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3501 - accuracy: 0.6598 - val_loss: 1.7184 - val_accuracy: 0.5853
Epoch 67/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3696 - accuracy: 0.6607 - val_loss: 1.8798 - val_accuracy: 0.5345
Epoch 68/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3322 - accuracy: 0.6673 - val_loss: 1.8543 - val_accuracy: 0.5352
Epoch 69/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3201 - accuracy: 0.6672 - val_loss: 1.7317 - val_accuracy: 0.5690
Epoch 70/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2877 - accuracy: 0.6822 - val_loss: 1.8284 - val_accuracy: 0.5462
Epoch 71/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2734 - accuracy: 0.6873 - val_loss: 1.7143 - val_accuracy: 0.5801
Epoch 72/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2889 - accuracy: 0.6810 - val_loss: 1.8024 - val_accuracy: 0.5579
Epoch 73/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2607 - accuracy: 0.6842 - val_loss: 1.7619 - val_accuracy: 0.5605
Epoch 74/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2582 - accuracy: 0.6894 - val_loss: 1.8076 - val_accuracy: 0.5410
Epoch 75/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2105 - accuracy: 0.7010 - val_loss: 1.7146 - val_accuracy: 0.5605
Epoch 76/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2525 - accuracy: 0.6810 - val_loss: 1.7779 - val_accuracy: 0.5436
Epoch 77/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2344 - accuracy: 0.6920 - val_loss: 1.7298 - val_accuracy: 0.5742
Epoch 78/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2028 - accuracy: 0.6982 - val_loss: 1.7555 - val_accuracy: 0.5677
Epoch 79/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1753 - accuracy: 0.7075 - val_loss: 1.6954 - val_accuracy: 0.5716
Epoch 80/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2014 - accuracy: 0.7037 - val_loss: 1.6497 - val_accuracy: 0.5794
Epoch 81/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1965 - accuracy: 0.6968 - val_loss: 1.6395 - val_accuracy: 0.5827
Epoch 82/200
67/67 [==============================] - 1s 9ms/step - loss: 1.1630 - accuracy: 0.7057 - val_loss: 1.7831 - val_accuracy: 0.5521
Epoch 83/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1687 - accuracy: 0.7021 - val_loss: 1.6804 - val_accuracy: 0.5697
Epoch 84/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1917 - accuracy: 0.6975 - val_loss: 1.7776 - val_accuracy: 0.5645
Epoch 85/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1467 - accuracy: 0.7027 - val_loss: 1.5992 - val_accuracy: 0.5944
Epoch 86/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1100 - accuracy: 0.7181 - val_loss: 1.7115 - val_accuracy: 0.5710
Epoch 87/200
67/67 [==============================] - 1s 9ms/step - loss: 1.1318 - accuracy: 0.7090 - val_loss: 1.6168 - val_accuracy: 0.5996
Epoch 88/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1141 - accuracy: 0.7156 - val_loss: 1.5470 - val_accuracy: 0.6120
Epoch 89/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1083 - accuracy: 0.7177 - val_loss: 1.7775 - val_accuracy: 0.5599
Epoch 90/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0875 - accuracy: 0.7195 - val_loss: 1.5216 - val_accuracy: 0.6270
Epoch 91/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0909 - accuracy: 0.7258 - val_loss: 1.7149 - val_accuracy: 0.5671
Epoch 92/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0867 - accuracy: 0.7203 - val_loss: 1.5729 - val_accuracy: 0.6022
Epoch 93/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0646 - accuracy: 0.7292 - val_loss: 1.5059 - val_accuracy: 0.6328
Epoch 94/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0849 - accuracy: 0.7192 - val_loss: 1.5242 - val_accuracy: 0.6204
Epoch 95/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0599 - accuracy: 0.7231 - val_loss: 1.6215 - val_accuracy: 0.5924
Epoch 96/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0318 - accuracy: 0.7338 - val_loss: 1.5924 - val_accuracy: 0.6068
Epoch 97/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0462 - accuracy: 0.7319 - val_loss: 1.4726 - val_accuracy: 0.6289
Epoch 98/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0195 - accuracy: 0.7437 - val_loss: 1.5815 - val_accuracy: 0.6035
Epoch 99/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0000 - accuracy: 0.7513 - val_loss: 1.5671 - val_accuracy: 0.6035
Epoch 100/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0278 - accuracy: 0.7416 - val_loss: 1.4628 - val_accuracy: 0.6328
Epoch 101/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0083 - accuracy: 0.7445 - val_loss: 1.4910 - val_accuracy: 0.6243
Epoch 102/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0050 - accuracy: 0.7392 - val_loss: 1.4038 - val_accuracy: 0.6536
Epoch 103/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9948 - accuracy: 0.7470 - val_loss: 1.4582 - val_accuracy: 0.6348
Epoch 104/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9862 - accuracy: 0.7543 - val_loss: 1.4877 - val_accuracy: 0.6230
Epoch 105/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9804 - accuracy: 0.7483 - val_loss: 1.4733 - val_accuracy: 0.6243
Epoch 106/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9647 - accuracy: 0.7565 - val_loss: 1.6473 - val_accuracy: 0.5827
Epoch 107/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9444 - accuracy: 0.7539 - val_loss: 1.5941 - val_accuracy: 0.6042
Epoch 108/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9581 - accuracy: 0.7581 - val_loss: 1.4117 - val_accuracy: 0.6504
Epoch 109/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9444 - accuracy: 0.7530 - val_loss: 1.4534 - val_accuracy: 0.6341
Epoch 110/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9153 - accuracy: 0.7607 - val_loss: 1.6184 - val_accuracy: 0.6055
Epoch 111/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9350 - accuracy: 0.7524 - val_loss: 1.4209 - val_accuracy: 0.6445
Epoch 112/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9296 - accuracy: 0.7613 - val_loss: 1.3762 - val_accuracy: 0.6602
Epoch 113/200
67/67 [==============================] - 1s 9ms/step - loss: 0.8957 - accuracy: 0.7700 - val_loss: 1.4728 - val_accuracy: 0.6217
Epoch 114/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9010 - accuracy: 0.7716 - val_loss: 1.3756 - val_accuracy: 0.6615
Epoch 115/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8969 - accuracy: 0.7689 - val_loss: 1.3634 - val_accuracy: 0.6615
Epoch 116/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8945 - accuracy: 0.7750 - val_loss: 1.5493 - val_accuracy: 0.6126
Epoch 117/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8930 - accuracy: 0.7635 - val_loss: 1.4778 - val_accuracy: 0.6289
Epoch 118/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9034 - accuracy: 0.7604 - val_loss: 1.4564 - val_accuracy: 0.6289
Epoch 119/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8737 - accuracy: 0.7753 - val_loss: 1.4416 - val_accuracy: 0.6309
Epoch 120/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8776 - accuracy: 0.7703 - val_loss: 1.4129 - val_accuracy: 0.6471
Epoch 121/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8636 - accuracy: 0.7812 - val_loss: 1.4889 - val_accuracy: 0.6289
Epoch 122/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8792 - accuracy: 0.7702 - val_loss: 1.4553 - val_accuracy: 0.6296
Epoch 123/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8676 - accuracy: 0.7752 - val_loss: 1.5470 - val_accuracy: 0.6139
Epoch 124/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8921 - accuracy: 0.7643 - val_loss: 1.3383 - val_accuracy: 0.6530
Epoch 125/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8508 - accuracy: 0.7806 - val_loss: 1.3713 - val_accuracy: 0.6484
Epoch 126/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8602 - accuracy: 0.7686 - val_loss: 1.4265 - val_accuracy: 0.6328
Epoch 127/200
67/67 [==============================] - 1s 9ms/step - loss: 0.8643 - accuracy: 0.7747 - val_loss: 1.4090 - val_accuracy: 0.6497
Epoch 128/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8346 - accuracy: 0.7835 - val_loss: 1.4491 - val_accuracy: 0.6289
Epoch 129/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8302 - accuracy: 0.7848 - val_loss: 1.4487 - val_accuracy: 0.6387
Epoch 130/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8251 - accuracy: 0.7931 - val_loss: 1.3700 - val_accuracy: 0.6536
Epoch 131/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8298 - accuracy: 0.7843 - val_loss: 1.4856 - val_accuracy: 0.6393
Epoch 132/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7965 - accuracy: 0.7944 - val_loss: 1.3251 - val_accuracy: 0.6673
Epoch 133/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8278 - accuracy: 0.7877 - val_loss: 1.4794 - val_accuracy: 0.6309
Epoch 134/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8234 - accuracy: 0.7885 - val_loss: 1.3919 - val_accuracy: 0.6523
Epoch 135/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7961 - accuracy: 0.7964 - val_loss: 1.3820 - val_accuracy: 0.6491
Epoch 136/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7927 - accuracy: 0.7956 - val_loss: 1.5385 - val_accuracy: 0.6191
Epoch 137/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7841 - accuracy: 0.7927 - val_loss: 1.4289 - val_accuracy: 0.6354
Epoch 138/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7921 - accuracy: 0.7945 - val_loss: 1.4095 - val_accuracy: 0.6367
Epoch 139/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7677 - accuracy: 0.8015 - val_loss: 1.4257 - val_accuracy: 0.6367
Epoch 140/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7805 - accuracy: 0.7942 - val_loss: 1.3991 - val_accuracy: 0.6504
Epoch 141/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7862 - accuracy: 0.7934 - val_loss: 1.4384 - val_accuracy: 0.6283
Epoch 142/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7701 - accuracy: 0.8003 - val_loss: 1.3901 - val_accuracy: 0.6452
Epoch 143/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7723 - accuracy: 0.7995 - val_loss: 1.4230 - val_accuracy: 0.6406
Epoch 144/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7620 - accuracy: 0.7971 - val_loss: 1.3483 - val_accuracy: 0.6602
Epoch 145/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7546 - accuracy: 0.8019 - val_loss: 1.3739 - val_accuracy: 0.6484
Epoch 146/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7751 - accuracy: 0.7999 - val_loss: 1.3810 - val_accuracy: 0.6602
Epoch 147/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7483 - accuracy: 0.8042 - val_loss: 1.4483 - val_accuracy: 0.6361
Epoch 148/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7541 - accuracy: 0.7987 - val_loss: 1.4993 - val_accuracy: 0.6263
Epoch 149/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7326 - accuracy: 0.8069 - val_loss: 1.3322 - val_accuracy: 0.6641
Epoch 150/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7290 - accuracy: 0.8054 - val_loss: 1.3682 - val_accuracy: 0.6615
Epoch 151/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7164 - accuracy: 0.8104 - val_loss: 1.4095 - val_accuracy: 0.6484
Epoch 152/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7236 - accuracy: 0.8100 - val_loss: 1.3792 - val_accuracy: 0.6478
In [ ]:
_, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
accuracies_opt_128["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.342
Accuracy: 65.820%
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_7"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_13 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_13 (Batc (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_13 (ReLU)              (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_9 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_18 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_14 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_14 (Batc (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_14 (ReLU)              (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_10 (MaxPooling (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_19 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_15 (Conv2D)           (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_15 (Batc (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_15 (ReLU)              (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d_1 (Average (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_20 (Dropout)         (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_4 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_11 (Dense)             (None, 1024)              525312    
_________________________________________________________________
dropout_21 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_12 (Dense)             (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
Epoch 1/200
67/67 [==============================] - 1s 11ms/step - loss: 4.3439 - accuracy: 0.0855 - val_loss: 4.2880 - val_accuracy: 0.0651
Epoch 2/200
67/67 [==============================] - 1s 8ms/step - loss: 3.8574 - accuracy: 0.2149 - val_loss: 4.4538 - val_accuracy: 0.0820
Epoch 3/200
67/67 [==============================] - 1s 9ms/step - loss: 3.5906 - accuracy: 0.2779 - val_loss: 4.6666 - val_accuracy: 0.0801
Epoch 4/200
67/67 [==============================] - 1s 8ms/step - loss: 3.3919 - accuracy: 0.3127 - val_loss: 4.7394 - val_accuracy: 0.0768
Epoch 5/200
67/67 [==============================] - 1s 8ms/step - loss: 3.2012 - accuracy: 0.3476 - val_loss: 4.4618 - val_accuracy: 0.1048
Epoch 6/200
67/67 [==============================] - 1s 8ms/step - loss: 3.0356 - accuracy: 0.3852 - val_loss: 4.1658 - val_accuracy: 0.1224
Epoch 7/200
67/67 [==============================] - 1s 8ms/step - loss: 2.9339 - accuracy: 0.4131 - val_loss: 3.7024 - val_accuracy: 0.1908
Epoch 8/200
67/67 [==============================] - 1s 8ms/step - loss: 2.8211 - accuracy: 0.4298 - val_loss: 3.3436 - val_accuracy: 0.2611
Epoch 9/200
67/67 [==============================] - 1s 8ms/step - loss: 2.7616 - accuracy: 0.4265 - val_loss: 3.0786 - val_accuracy: 0.3275
Epoch 10/200
67/67 [==============================] - 1s 8ms/step - loss: 2.6456 - accuracy: 0.4508 - val_loss: 2.9704 - val_accuracy: 0.3587
Epoch 11/200
67/67 [==============================] - 1s 8ms/step - loss: 2.5644 - accuracy: 0.4734 - val_loss: 2.8454 - val_accuracy: 0.3770
Epoch 12/200
67/67 [==============================] - 1s 8ms/step - loss: 2.4879 - accuracy: 0.4840 - val_loss: 2.7138 - val_accuracy: 0.4089
Epoch 13/200
67/67 [==============================] - 1s 9ms/step - loss: 2.4211 - accuracy: 0.4902 - val_loss: 2.7104 - val_accuracy: 0.4134
Epoch 14/200
67/67 [==============================] - 1s 8ms/step - loss: 2.3922 - accuracy: 0.4883 - val_loss: 2.8825 - val_accuracy: 0.3665
Epoch 15/200
67/67 [==============================] - 1s 8ms/step - loss: 2.3194 - accuracy: 0.5117 - val_loss: 2.8417 - val_accuracy: 0.3743
Epoch 16/200
67/67 [==============================] - 1s 8ms/step - loss: 2.2618 - accuracy: 0.5200 - val_loss: 2.6330 - val_accuracy: 0.4219
Epoch 17/200
67/67 [==============================] - 1s 8ms/step - loss: 2.2111 - accuracy: 0.5257 - val_loss: 2.6277 - val_accuracy: 0.4134
Epoch 18/200
67/67 [==============================] - 1s 8ms/step - loss: 2.1693 - accuracy: 0.5350 - val_loss: 2.5802 - val_accuracy: 0.4232
Epoch 19/200
67/67 [==============================] - 1s 8ms/step - loss: 2.1142 - accuracy: 0.5393 - val_loss: 2.5152 - val_accuracy: 0.4414
Epoch 20/200
67/67 [==============================] - 1s 9ms/step - loss: 2.1002 - accuracy: 0.5427 - val_loss: 2.3551 - val_accuracy: 0.4785
Epoch 21/200
67/67 [==============================] - 1s 8ms/step - loss: 2.0399 - accuracy: 0.5535 - val_loss: 2.4648 - val_accuracy: 0.4473
Epoch 22/200
67/67 [==============================] - 1s 8ms/step - loss: 2.0204 - accuracy: 0.5553 - val_loss: 2.5754 - val_accuracy: 0.4245
Epoch 23/200
67/67 [==============================] - 1s 8ms/step - loss: 1.9651 - accuracy: 0.5674 - val_loss: 2.2756 - val_accuracy: 0.4850
Epoch 24/200
67/67 [==============================] - 1s 8ms/step - loss: 1.9539 - accuracy: 0.5612 - val_loss: 2.1814 - val_accuracy: 0.5072
Epoch 25/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8765 - accuracy: 0.5813 - val_loss: 2.2003 - val_accuracy: 0.4967
Epoch 26/200
67/67 [==============================] - 1s 9ms/step - loss: 1.8403 - accuracy: 0.5868 - val_loss: 2.0699 - val_accuracy: 0.5371
Epoch 27/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8381 - accuracy: 0.5857 - val_loss: 2.4047 - val_accuracy: 0.4479
Epoch 28/200
67/67 [==============================] - 1s 8ms/step - loss: 1.8155 - accuracy: 0.5890 - val_loss: 2.4110 - val_accuracy: 0.4460
Epoch 29/200
67/67 [==============================] - 1s 9ms/step - loss: 1.7808 - accuracy: 0.5985 - val_loss: 2.2027 - val_accuracy: 0.4870
Epoch 30/200
67/67 [==============================] - 1s 8ms/step - loss: 1.7401 - accuracy: 0.6005 - val_loss: 2.0843 - val_accuracy: 0.5137
Epoch 31/200
67/67 [==============================] - 1s 9ms/step - loss: 1.7106 - accuracy: 0.6082 - val_loss: 2.1952 - val_accuracy: 0.4948
Epoch 32/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6828 - accuracy: 0.6081 - val_loss: 2.0262 - val_accuracy: 0.5215
Epoch 33/200
67/67 [==============================] - 1s 9ms/step - loss: 1.6474 - accuracy: 0.6251 - val_loss: 1.8959 - val_accuracy: 0.5560
Epoch 34/200
67/67 [==============================] - 1s 8ms/step - loss: 1.6410 - accuracy: 0.6103 - val_loss: 2.1341 - val_accuracy: 0.4922
Epoch 35/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5855 - accuracy: 0.6310 - val_loss: 1.9497 - val_accuracy: 0.5462
Epoch 36/200
67/67 [==============================] - 1s 8ms/step - loss: 1.5780 - accuracy: 0.6369 - val_loss: 2.1528 - val_accuracy: 0.5013
Epoch 37/200
67/67 [==============================] - 1s 9ms/step - loss: 1.5463 - accuracy: 0.6332 - val_loss: 2.0549 - val_accuracy: 0.5130
Epoch 38/200
67/67 [==============================] - 1s 9ms/step - loss: 1.5439 - accuracy: 0.6271 - val_loss: 1.8822 - val_accuracy: 0.5540
Epoch 39/200
67/67 [==============================] - 1s 9ms/step - loss: 1.4951 - accuracy: 0.6505 - val_loss: 1.9267 - val_accuracy: 0.5378
Epoch 40/200
67/67 [==============================] - 1s 9ms/step - loss: 1.4900 - accuracy: 0.6482 - val_loss: 1.7251 - val_accuracy: 0.5918
Epoch 41/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4670 - accuracy: 0.6591 - val_loss: 1.9841 - val_accuracy: 0.5371
Epoch 42/200
67/67 [==============================] - 1s 9ms/step - loss: 1.4618 - accuracy: 0.6511 - val_loss: 1.8828 - val_accuracy: 0.5521
Epoch 43/200
67/67 [==============================] - 1s 8ms/step - loss: 1.4249 - accuracy: 0.6652 - val_loss: 1.6802 - val_accuracy: 0.5905
Epoch 44/200
67/67 [==============================] - 1s 9ms/step - loss: 1.3813 - accuracy: 0.6682 - val_loss: 1.8416 - val_accuracy: 0.5540
Epoch 45/200
67/67 [==============================] - 1s 9ms/step - loss: 1.4026 - accuracy: 0.6602 - val_loss: 1.7030 - val_accuracy: 0.5801
Epoch 46/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3792 - accuracy: 0.6631 - val_loss: 1.8153 - val_accuracy: 0.5612
Epoch 47/200
67/67 [==============================] - 1s 9ms/step - loss: 1.3837 - accuracy: 0.6716 - val_loss: 1.6790 - val_accuracy: 0.5905
Epoch 48/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3499 - accuracy: 0.6728 - val_loss: 1.7853 - val_accuracy: 0.5618
Epoch 49/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3153 - accuracy: 0.6869 - val_loss: 1.8483 - val_accuracy: 0.5566
Epoch 50/200
67/67 [==============================] - 1s 8ms/step - loss: 1.3142 - accuracy: 0.6804 - val_loss: 1.7340 - val_accuracy: 0.5775
Epoch 51/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2803 - accuracy: 0.6915 - val_loss: 1.8235 - val_accuracy: 0.5579
Epoch 52/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2690 - accuracy: 0.6951 - val_loss: 1.6991 - val_accuracy: 0.5866
Epoch 53/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2644 - accuracy: 0.6971 - val_loss: 1.7354 - val_accuracy: 0.5736
Epoch 54/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2440 - accuracy: 0.6989 - val_loss: 1.6439 - val_accuracy: 0.5924
Epoch 55/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2389 - accuracy: 0.6982 - val_loss: 1.6255 - val_accuracy: 0.5938
Epoch 56/200
67/67 [==============================] - 1s 9ms/step - loss: 1.2163 - accuracy: 0.7101 - val_loss: 1.6328 - val_accuracy: 0.6003
Epoch 57/200
67/67 [==============================] - 1s 8ms/step - loss: 1.2038 - accuracy: 0.7033 - val_loss: 1.7921 - val_accuracy: 0.5632
Epoch 58/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1880 - accuracy: 0.7045 - val_loss: 1.7266 - val_accuracy: 0.5755
Epoch 59/200
67/67 [==============================] - 1s 9ms/step - loss: 1.1644 - accuracy: 0.7135 - val_loss: 1.6156 - val_accuracy: 0.6022
Epoch 60/200
67/67 [==============================] - 1s 9ms/step - loss: 1.1615 - accuracy: 0.7189 - val_loss: 1.4818 - val_accuracy: 0.6302
Epoch 61/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1667 - accuracy: 0.7068 - val_loss: 1.4543 - val_accuracy: 0.6439
Epoch 62/200
67/67 [==============================] - 1s 8ms/step - loss: 1.1439 - accuracy: 0.7105 - val_loss: 1.4678 - val_accuracy: 0.6341
Epoch 63/200
67/67 [==============================] - 1s 9ms/step - loss: 1.1381 - accuracy: 0.7212 - val_loss: 1.4166 - val_accuracy: 0.6465
Epoch 64/200
67/67 [==============================] - 1s 9ms/step - loss: 1.1226 - accuracy: 0.7182 - val_loss: 1.5808 - val_accuracy: 0.6048
Epoch 65/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0845 - accuracy: 0.7303 - val_loss: 1.6037 - val_accuracy: 0.6094
Epoch 66/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0894 - accuracy: 0.7344 - val_loss: 1.6422 - val_accuracy: 0.6055
Epoch 67/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0901 - accuracy: 0.7263 - val_loss: 1.4132 - val_accuracy: 0.6504
Epoch 68/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0747 - accuracy: 0.7373 - val_loss: 1.5460 - val_accuracy: 0.6237
Epoch 69/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0613 - accuracy: 0.7363 - val_loss: 1.5938 - val_accuracy: 0.6042
Epoch 70/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0664 - accuracy: 0.7258 - val_loss: 1.4681 - val_accuracy: 0.6400
Epoch 71/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0440 - accuracy: 0.7377 - val_loss: 1.4681 - val_accuracy: 0.6348
Epoch 72/200
67/67 [==============================] - 1s 8ms/step - loss: 1.0309 - accuracy: 0.7349 - val_loss: 1.4799 - val_accuracy: 0.6348
Epoch 73/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0039 - accuracy: 0.7534 - val_loss: 1.4134 - val_accuracy: 0.6484
Epoch 74/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0130 - accuracy: 0.7387 - val_loss: 1.3500 - val_accuracy: 0.6523
Epoch 75/200
67/67 [==============================] - 1s 9ms/step - loss: 1.0030 - accuracy: 0.7520 - val_loss: 1.5002 - val_accuracy: 0.6400
Epoch 76/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9949 - accuracy: 0.7474 - val_loss: 1.4375 - val_accuracy: 0.6367
Epoch 77/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9892 - accuracy: 0.7516 - val_loss: 1.3291 - val_accuracy: 0.6634
Epoch 78/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9776 - accuracy: 0.7493 - val_loss: 1.4505 - val_accuracy: 0.6419
Epoch 79/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9690 - accuracy: 0.7545 - val_loss: 1.4860 - val_accuracy: 0.6361
Epoch 80/200
67/67 [==============================] - 1s 9ms/step - loss: 0.9534 - accuracy: 0.7605 - val_loss: 1.4432 - val_accuracy: 0.6413
Epoch 81/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9518 - accuracy: 0.7644 - val_loss: 1.4849 - val_accuracy: 0.6237
Epoch 82/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9380 - accuracy: 0.7657 - val_loss: 1.3765 - val_accuracy: 0.6667
Epoch 83/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9308 - accuracy: 0.7623 - val_loss: 1.4336 - val_accuracy: 0.6439
Epoch 84/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9155 - accuracy: 0.7683 - val_loss: 1.3044 - val_accuracy: 0.6693
Epoch 85/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9277 - accuracy: 0.7605 - val_loss: 1.2979 - val_accuracy: 0.6751
Epoch 86/200
67/67 [==============================] - 1s 9ms/step - loss: 0.8883 - accuracy: 0.7789 - val_loss: 1.4142 - val_accuracy: 0.6465
Epoch 87/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9162 - accuracy: 0.7634 - val_loss: 1.2903 - val_accuracy: 0.6660
Epoch 88/200
67/67 [==============================] - 1s 9ms/step - loss: 0.9049 - accuracy: 0.7762 - val_loss: 1.4433 - val_accuracy: 0.6374
Epoch 89/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8801 - accuracy: 0.7740 - val_loss: 1.2860 - val_accuracy: 0.6732
Epoch 90/200
67/67 [==============================] - 1s 8ms/step - loss: 0.9014 - accuracy: 0.7688 - val_loss: 1.3829 - val_accuracy: 0.6510
Epoch 91/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8396 - accuracy: 0.7902 - val_loss: 1.4369 - val_accuracy: 0.6523
Epoch 92/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8848 - accuracy: 0.7673 - val_loss: 1.3454 - val_accuracy: 0.6608
Epoch 93/200
67/67 [==============================] - 1s 9ms/step - loss: 0.8438 - accuracy: 0.7852 - val_loss: 1.3003 - val_accuracy: 0.6706
Epoch 94/200
67/67 [==============================] - 1s 9ms/step - loss: 0.8214 - accuracy: 0.7986 - val_loss: 1.3442 - val_accuracy: 0.6569
Epoch 95/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8598 - accuracy: 0.7785 - val_loss: 1.2514 - val_accuracy: 0.6758
Epoch 96/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8464 - accuracy: 0.7797 - val_loss: 1.3252 - val_accuracy: 0.6719
Epoch 97/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8433 - accuracy: 0.7881 - val_loss: 1.2900 - val_accuracy: 0.6810
Epoch 98/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8246 - accuracy: 0.7919 - val_loss: 1.2542 - val_accuracy: 0.6745
Epoch 99/200
67/67 [==============================] - 1s 9ms/step - loss: 0.8063 - accuracy: 0.7946 - val_loss: 1.2841 - val_accuracy: 0.6777
Epoch 100/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8344 - accuracy: 0.7897 - val_loss: 1.3990 - val_accuracy: 0.6556
Epoch 101/200
67/67 [==============================] - 1s 8ms/step - loss: 0.8007 - accuracy: 0.7932 - val_loss: 1.3730 - val_accuracy: 0.6719
Epoch 102/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7693 - accuracy: 0.8080 - val_loss: 1.2318 - val_accuracy: 0.6816
Epoch 103/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7640 - accuracy: 0.8110 - val_loss: 1.2689 - val_accuracy: 0.6784
Epoch 104/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7699 - accuracy: 0.8065 - val_loss: 1.3875 - val_accuracy: 0.6562
Epoch 105/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7560 - accuracy: 0.8097 - val_loss: 1.2805 - val_accuracy: 0.6823
Epoch 106/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7715 - accuracy: 0.7977 - val_loss: 1.2808 - val_accuracy: 0.6823
Epoch 107/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7355 - accuracy: 0.8102 - val_loss: 1.2836 - val_accuracy: 0.6829
Epoch 108/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7600 - accuracy: 0.8079 - val_loss: 1.3467 - val_accuracy: 0.6641
Epoch 109/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7472 - accuracy: 0.8056 - val_loss: 1.2971 - val_accuracy: 0.6758
Epoch 110/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7402 - accuracy: 0.8126 - val_loss: 1.3220 - val_accuracy: 0.6758
Epoch 111/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7418 - accuracy: 0.8043 - val_loss: 1.3363 - val_accuracy: 0.6738
Epoch 112/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7262 - accuracy: 0.8147 - val_loss: 1.3467 - val_accuracy: 0.6686
Epoch 113/200
67/67 [==============================] - 1s 8ms/step - loss: 0.6884 - accuracy: 0.8293 - val_loss: 1.2325 - val_accuracy: 0.6829
Epoch 114/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7140 - accuracy: 0.8204 - val_loss: 1.2358 - val_accuracy: 0.6823
Epoch 115/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7155 - accuracy: 0.8195 - val_loss: 1.2523 - val_accuracy: 0.6849
Epoch 116/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7166 - accuracy: 0.8217 - val_loss: 1.3509 - val_accuracy: 0.6680
Epoch 117/200
67/67 [==============================] - 1s 8ms/step - loss: 0.7010 - accuracy: 0.8225 - val_loss: 1.1961 - val_accuracy: 0.6999
Epoch 118/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6842 - accuracy: 0.8259 - val_loss: 1.2996 - val_accuracy: 0.6816
Epoch 119/200
67/67 [==============================] - 1s 8ms/step - loss: 0.6804 - accuracy: 0.8255 - val_loss: 1.3544 - val_accuracy: 0.6693
Epoch 120/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6958 - accuracy: 0.8319 - val_loss: 1.2487 - val_accuracy: 0.6751
Epoch 121/200
67/67 [==============================] - 1s 9ms/step - loss: 0.7026 - accuracy: 0.8160 - val_loss: 1.2773 - val_accuracy: 0.6816
Epoch 122/200
67/67 [==============================] - 1s 8ms/step - loss: 0.6723 - accuracy: 0.8323 - val_loss: 1.2442 - val_accuracy: 0.6953
Epoch 123/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6792 - accuracy: 0.8279 - val_loss: 1.2515 - val_accuracy: 0.6823
Epoch 124/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6604 - accuracy: 0.8293 - val_loss: 1.3520 - val_accuracy: 0.6693
Epoch 125/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6477 - accuracy: 0.8315 - val_loss: 1.2317 - val_accuracy: 0.6921
Epoch 126/200
67/67 [==============================] - 1s 8ms/step - loss: 0.6465 - accuracy: 0.8379 - val_loss: 1.1972 - val_accuracy: 0.6986
Epoch 127/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6534 - accuracy: 0.8322 - val_loss: 1.2637 - val_accuracy: 0.6849
Epoch 128/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6482 - accuracy: 0.8308 - val_loss: 1.3467 - val_accuracy: 0.6706
Epoch 129/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6402 - accuracy: 0.8420 - val_loss: 1.2771 - val_accuracy: 0.6934
Epoch 130/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6357 - accuracy: 0.8374 - val_loss: 1.2092 - val_accuracy: 0.6992
Epoch 131/200
67/67 [==============================] - 1s 8ms/step - loss: 0.6389 - accuracy: 0.8344 - val_loss: 1.2209 - val_accuracy: 0.6934
Epoch 132/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6295 - accuracy: 0.8431 - val_loss: 1.3026 - val_accuracy: 0.6895
Epoch 133/200
67/67 [==============================] - 1s 8ms/step - loss: 0.6459 - accuracy: 0.8363 - val_loss: 1.3319 - val_accuracy: 0.6706
Epoch 134/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6350 - accuracy: 0.8383 - val_loss: 1.2771 - val_accuracy: 0.6842
Epoch 135/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6023 - accuracy: 0.8500 - val_loss: 1.2679 - val_accuracy: 0.6803
Epoch 136/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6047 - accuracy: 0.8489 - val_loss: 1.2512 - val_accuracy: 0.6966
Epoch 137/200
67/67 [==============================] - 1s 9ms/step - loss: 0.6188 - accuracy: 0.8383 - val_loss: 1.3416 - val_accuracy: 0.6784
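Training above halts at epoch 137 even though `epochs = 200` was requested, which is consistent with the `callback` passed to `train_model` being an early-stopping callback monitoring `val_loss`. The exact callback configuration is not shown in this cell; the sketch below illustrates the stopping rule with a hypothetical helper and an assumed patience of 20.

```python
# Hedged sketch of the early-stopping decision rule (helper name and
# patience value are assumptions, not taken from the notebook):
# training stops once val_loss has not improved for `patience` epochs.
def epochs_until_early_stop(val_losses, patience=20):
    """Return the 1-based epoch at which training would halt."""
    best = float("inf")
    since_best = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss
            since_best = 0
        else:
            since_best += 1
        if since_best >= patience:
            return epoch
    return len(val_losses)  # never triggered: ran all epochs
```

In Keras this behavior corresponds to `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)`, typically with `restore_best_weights=True` so the evaluated model keeps the weights from the best validation epoch.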
In [ ]:
_, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
accuracies_opt_128["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.161
Accuracy: 71.387%
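The accuracy reported above is top-1 accuracy on the held-out test set: the fraction of images whose highest-scoring class matches the true label. A minimal stand-alone illustration of that metric (the helper name and toy data are hypothetical):

```python
# Top-1 accuracy: argmax of the predicted class scores vs. the true label.
def top1_accuracy(probs, labels):
    """Fraction of samples whose argmax prediction equals the label."""
    correct = sum(1 for p, y in zip(probs, labels)
                  if max(range(len(p)), key=p.__getitem__) == y)
    return correct / len(labels)

# toy example with 3 classes: predictions for samples 1 and 2 are correct
probs = [[0.1, 0.7, 0.2], [0.8, 0.1, 0.1], [0.3, 0.3, 0.4]]
labels = [1, 0, 1]
```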
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary=True)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs=200, callbacks=[callback])
Model: "sequential_8"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_16 (Conv2D)           (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_16 (Batc (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_16 (ReLU)              (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_11 (MaxPooling (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_22 (Dropout)         (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_17 (Conv2D)           (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_17 (Batc (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_17 (ReLU)              (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_12 (MaxPooling (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_23 (Dropout)         (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_18 (Conv2D)           (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_18 (Batc (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_18 (ReLU)              (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_13 (MaxPooling (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_24 (Dropout)         (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_19 (Conv2D)           (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_19 (Batc (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_19 (ReLU)              (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_25 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_5 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_13 (Dense)             (None, 512)               2097664   
_________________________________________________________________
dropout_26 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_14 (Dense)             (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
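The parameter counts in the summary above can be verified by hand: a `Conv2D` layer has `(kernel_h * kernel_w * in_channels + 1) * filters` parameters (the `+ 1` is the bias), a `Dense` layer has `(in_features + 1) * units`, and `BatchNormalization` has `4 * channels` (gamma, beta, moving mean, moving variance). The 3x3 kernel size is inferred from the counts themselves, since it is not printed in the summary.

```python
# Sanity-check the parameter counts printed in the CNN2 summary.
def conv2d_params(kh, kw, cin, filters):
    # weights (kh*kw*cin per filter) plus one bias per filter
    return (kh * kw * cin + 1) * filters

def dense_params(fin, units):
    # weight matrix plus one bias per unit
    return (fin + 1) * units

assert conv2d_params(3, 3, 3, 32) == 896        # conv2d_16
assert conv2d_params(3, 3, 32, 64) == 18496     # conv2d_17
assert conv2d_params(3, 3, 64, 128) == 73856    # conv2d_18
assert conv2d_params(3, 3, 128, 256) == 295168  # conv2d_19
assert dense_params(4096, 512) == 2097664       # dense_13
assert dense_params(512, 20) == 10260           # dense_14
assert 4 * 32 == 128                            # batch_normalization_16
```

Note that only half of each BatchNormalization layer's parameters (gamma and beta) are trainable, which is why the summary reports 960 non-trainable parameters: `2 * (32 + 64 + 128 + 256) = 960`.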
Epoch 1/200
67/67 [==============================] - 2s 13ms/step - loss: 6.1454 - accuracy: 0.1017 - val_loss: 6.0266 - val_accuracy: 0.0514
Epoch 2/200
67/67 [==============================] - 1s 10ms/step - loss: 5.5258 - accuracy: 0.1984 - val_loss: 6.2619 - val_accuracy: 0.0579
Epoch 3/200
67/67 [==============================] - 1s 10ms/step - loss: 5.2745 - accuracy: 0.2389 - val_loss: 6.4412 - val_accuracy: 0.0501
Epoch 4/200
67/67 [==============================] - 1s 10ms/step - loss: 5.0899 - accuracy: 0.2650 - val_loss: 6.5346 - val_accuracy: 0.0495
Epoch 5/200
67/67 [==============================] - 1s 10ms/step - loss: 4.8701 - accuracy: 0.3044 - val_loss: 6.4075 - val_accuracy: 0.0651
Epoch 6/200
67/67 [==============================] - 1s 11ms/step - loss: 4.6613 - accuracy: 0.3290 - val_loss: 6.1125 - val_accuracy: 0.0990
Epoch 7/200
67/67 [==============================] - 1s 10ms/step - loss: 4.4841 - accuracy: 0.3606 - val_loss: 5.6500 - val_accuracy: 0.1172
Epoch 8/200
67/67 [==============================] - 1s 10ms/step - loss: 4.3250 - accuracy: 0.3768 - val_loss: 5.1102 - val_accuracy: 0.1823
Epoch 9/200
67/67 [==============================] - 1s 10ms/step - loss: 4.1651 - accuracy: 0.3969 - val_loss: 4.7274 - val_accuracy: 0.2467
Epoch 10/200
67/67 [==============================] - 1s 11ms/step - loss: 4.0175 - accuracy: 0.4097 - val_loss: 4.5297 - val_accuracy: 0.2760
Epoch 11/200
67/67 [==============================] - 1s 10ms/step - loss: 3.8935 - accuracy: 0.4277 - val_loss: 4.3156 - val_accuracy: 0.3132
Epoch 12/200
67/67 [==============================] - 1s 10ms/step - loss: 3.7320 - accuracy: 0.4428 - val_loss: 4.1389 - val_accuracy: 0.3333
Epoch 13/200
67/67 [==============================] - 1s 10ms/step - loss: 3.6394 - accuracy: 0.4555 - val_loss: 4.1607 - val_accuracy: 0.3314
Epoch 14/200
67/67 [==============================] - 1s 10ms/step - loss: 3.5137 - accuracy: 0.4650 - val_loss: 4.1925 - val_accuracy: 0.3307
Epoch 15/200
67/67 [==============================] - 1s 10ms/step - loss: 3.3881 - accuracy: 0.4943 - val_loss: 4.4251 - val_accuracy: 0.2910
Epoch 16/200
67/67 [==============================] - 1s 11ms/step - loss: 3.3022 - accuracy: 0.4903 - val_loss: 4.0220 - val_accuracy: 0.3535
Epoch 17/200
67/67 [==============================] - 1s 10ms/step - loss: 3.2151 - accuracy: 0.5138 - val_loss: 4.3581 - val_accuracy: 0.2891
Epoch 18/200
67/67 [==============================] - 1s 10ms/step - loss: 3.0671 - accuracy: 0.5264 - val_loss: 3.6988 - val_accuracy: 0.3776
Epoch 19/200
67/67 [==============================] - 1s 10ms/step - loss: 2.9935 - accuracy: 0.5327 - val_loss: 3.8963 - val_accuracy: 0.3503
Epoch 20/200
67/67 [==============================] - 1s 10ms/step - loss: 2.9117 - accuracy: 0.5461 - val_loss: 3.7694 - val_accuracy: 0.3639
Epoch 21/200
67/67 [==============================] - 1s 10ms/step - loss: 2.8220 - accuracy: 0.5486 - val_loss: 3.6091 - val_accuracy: 0.3913
Epoch 22/200
67/67 [==============================] - 1s 10ms/step - loss: 2.7469 - accuracy: 0.5648 - val_loss: 3.6908 - val_accuracy: 0.3691
Epoch 23/200
67/67 [==============================] - 1s 10ms/step - loss: 2.6856 - accuracy: 0.5694 - val_loss: 3.3423 - val_accuracy: 0.4108
Epoch 24/200
67/67 [==============================] - 1s 10ms/step - loss: 2.5845 - accuracy: 0.5715 - val_loss: 3.2261 - val_accuracy: 0.4284
Epoch 25/200
67/67 [==============================] - 1s 10ms/step - loss: 2.5335 - accuracy: 0.5824 - val_loss: 3.7535 - val_accuracy: 0.3268
Epoch 26/200
67/67 [==============================] - 1s 10ms/step - loss: 2.4326 - accuracy: 0.6016 - val_loss: 3.3571 - val_accuracy: 0.3867
Epoch 27/200
67/67 [==============================] - 1s 11ms/step - loss: 2.3609 - accuracy: 0.6140 - val_loss: 3.1292 - val_accuracy: 0.4251
Epoch 28/200
67/67 [==============================] - 1s 10ms/step - loss: 2.3177 - accuracy: 0.6079 - val_loss: 3.1453 - val_accuracy: 0.4290
Epoch 29/200
67/67 [==============================] - 1s 10ms/step - loss: 2.2512 - accuracy: 0.6199 - val_loss: 3.0614 - val_accuracy: 0.4382
Epoch 30/200
67/67 [==============================] - 1s 11ms/step - loss: 2.1992 - accuracy: 0.6211 - val_loss: 2.9685 - val_accuracy: 0.4427
Epoch 31/200
67/67 [==============================] - 1s 11ms/step - loss: 2.1389 - accuracy: 0.6349 - val_loss: 2.8858 - val_accuracy: 0.4648
Epoch 32/200
67/67 [==============================] - 1s 11ms/step - loss: 2.0714 - accuracy: 0.6508 - val_loss: 3.0433 - val_accuracy: 0.4225
Epoch 33/200
67/67 [==============================] - 1s 11ms/step - loss: 2.0283 - accuracy: 0.6467 - val_loss: 2.8874 - val_accuracy: 0.4557
Epoch 34/200
67/67 [==============================] - 1s 11ms/step - loss: 1.9761 - accuracy: 0.6598 - val_loss: 2.5960 - val_accuracy: 0.5176
Epoch 35/200
67/67 [==============================] - 1s 11ms/step - loss: 1.9017 - accuracy: 0.6666 - val_loss: 2.8394 - val_accuracy: 0.4531
Epoch 36/200
67/67 [==============================] - 1s 10ms/step - loss: 1.8874 - accuracy: 0.6724 - val_loss: 2.6865 - val_accuracy: 0.4935
Epoch 37/200
67/67 [==============================] - 1s 11ms/step - loss: 1.8025 - accuracy: 0.6826 - val_loss: 2.7750 - val_accuracy: 0.4557
Epoch 38/200
67/67 [==============================] - 1s 10ms/step - loss: 1.7863 - accuracy: 0.6842 - val_loss: 2.4701 - val_accuracy: 0.5137
Epoch 39/200
67/67 [==============================] - 1s 11ms/step - loss: 1.7584 - accuracy: 0.6852 - val_loss: 2.5360 - val_accuracy: 0.5091
Epoch 40/200
67/67 [==============================] - 1s 11ms/step - loss: 1.7031 - accuracy: 0.7012 - val_loss: 2.4265 - val_accuracy: 0.5260
Epoch 41/200
67/67 [==============================] - 1s 11ms/step - loss: 1.6541 - accuracy: 0.7084 - val_loss: 2.3696 - val_accuracy: 0.5371
Epoch 42/200
67/67 [==============================] - 1s 10ms/step - loss: 1.6348 - accuracy: 0.7066 - val_loss: 2.5241 - val_accuracy: 0.4980
Epoch 43/200
67/67 [==============================] - 1s 11ms/step - loss: 1.5758 - accuracy: 0.7250 - val_loss: 2.4359 - val_accuracy: 0.5182
Epoch 44/200
67/67 [==============================] - 1s 10ms/step - loss: 1.5761 - accuracy: 0.7094 - val_loss: 2.7073 - val_accuracy: 0.4564
Epoch 45/200
67/67 [==============================] - 1s 10ms/step - loss: 1.4881 - accuracy: 0.7301 - val_loss: 2.5056 - val_accuracy: 0.4980
Epoch 46/200
67/67 [==============================] - 1s 10ms/step - loss: 1.4828 - accuracy: 0.7338 - val_loss: 2.4148 - val_accuracy: 0.5215
Epoch 47/200
67/67 [==============================] - 1s 11ms/step - loss: 1.4276 - accuracy: 0.7389 - val_loss: 2.1809 - val_accuracy: 0.5625
Epoch 48/200
67/67 [==============================] - 1s 10ms/step - loss: 1.4220 - accuracy: 0.7432 - val_loss: 2.4938 - val_accuracy: 0.4967
Epoch 49/200
67/67 [==============================] - 1s 11ms/step - loss: 1.3597 - accuracy: 0.7532 - val_loss: 2.5239 - val_accuracy: 0.4896
Epoch 50/200
67/67 [==============================] - 1s 10ms/step - loss: 1.3479 - accuracy: 0.7545 - val_loss: 2.1831 - val_accuracy: 0.5488
Epoch 51/200
67/67 [==============================] - 1s 11ms/step - loss: 1.3156 - accuracy: 0.7560 - val_loss: 2.2193 - val_accuracy: 0.5449
Epoch 52/200
67/67 [==============================] - 1s 11ms/step - loss: 1.2661 - accuracy: 0.7712 - val_loss: 2.1496 - val_accuracy: 0.5599
Epoch 53/200
67/67 [==============================] - 1s 10ms/step - loss: 1.2320 - accuracy: 0.7796 - val_loss: 2.1936 - val_accuracy: 0.5469
Epoch 54/200
67/67 [==============================] - 1s 11ms/step - loss: 1.2287 - accuracy: 0.7719 - val_loss: 2.1294 - val_accuracy: 0.5618
Epoch 55/200
67/67 [==============================] - 1s 11ms/step - loss: 1.1978 - accuracy: 0.7780 - val_loss: 2.1264 - val_accuracy: 0.5723
Epoch 56/200
67/67 [==============================] - 1s 10ms/step - loss: 1.1794 - accuracy: 0.7862 - val_loss: 2.0759 - val_accuracy: 0.5794
Epoch 57/200
67/67 [==============================] - 1s 10ms/step - loss: 1.1462 - accuracy: 0.7887 - val_loss: 1.9002 - val_accuracy: 0.6055
Epoch 58/200
67/67 [==============================] - 1s 10ms/step - loss: 1.1336 - accuracy: 0.7923 - val_loss: 1.9820 - val_accuracy: 0.5924
Epoch 59/200
67/67 [==============================] - 1s 10ms/step - loss: 1.0889 - accuracy: 0.8080 - val_loss: 1.9390 - val_accuracy: 0.6022
Epoch 60/200
67/67 [==============================] - 1s 11ms/step - loss: 1.0618 - accuracy: 0.8011 - val_loss: 1.9714 - val_accuracy: 0.5970
Epoch 61/200
67/67 [==============================] - 1s 10ms/step - loss: 1.0610 - accuracy: 0.8077 - val_loss: 2.1131 - val_accuracy: 0.5645
Epoch 62/200
67/67 [==============================] - 1s 10ms/step - loss: 1.0310 - accuracy: 0.8100 - val_loss: 1.9369 - val_accuracy: 0.5859
Epoch 63/200
67/67 [==============================] - 1s 10ms/step - loss: 1.0059 - accuracy: 0.8116 - val_loss: 1.9312 - val_accuracy: 0.5938
Epoch 64/200
67/67 [==============================] - 1s 10ms/step - loss: 0.9813 - accuracy: 0.8241 - val_loss: 1.8365 - val_accuracy: 0.6133
Epoch 65/200
67/67 [==============================] - 1s 10ms/step - loss: 0.9458 - accuracy: 0.8256 - val_loss: 1.8595 - val_accuracy: 0.6178
Epoch 66/200
67/67 [==============================] - 1s 11ms/step - loss: 0.9380 - accuracy: 0.8267 - val_loss: 1.8039 - val_accuracy: 0.6276
Epoch 67/200
67/67 [==============================] - 1s 10ms/step - loss: 0.9246 - accuracy: 0.8352 - val_loss: 1.9159 - val_accuracy: 0.6087
Epoch 68/200
67/67 [==============================] - 1s 10ms/step - loss: 0.8875 - accuracy: 0.8412 - val_loss: 1.8300 - val_accuracy: 0.6061
Epoch 69/200
67/67 [==============================] - 1s 10ms/step - loss: 0.8661 - accuracy: 0.8521 - val_loss: 1.8308 - val_accuracy: 0.6172
Epoch 70/200
67/67 [==============================] - 1s 11ms/step - loss: 0.8489 - accuracy: 0.8514 - val_loss: 1.8985 - val_accuracy: 0.6048
Epoch 71/200
67/67 [==============================] - 1s 10ms/step - loss: 0.8746 - accuracy: 0.8366 - val_loss: 1.8224 - val_accuracy: 0.6165
Epoch 72/200
67/67 [==============================] - 1s 10ms/step - loss: 0.8065 - accuracy: 0.8608 - val_loss: 1.7259 - val_accuracy: 0.6257
Epoch 73/200
67/67 [==============================] - 1s 10ms/step - loss: 0.8127 - accuracy: 0.8551 - val_loss: 1.9286 - val_accuracy: 0.6048
Epoch 74/200
67/67 [==============================] - 1s 10ms/step - loss: 0.7993 - accuracy: 0.8571 - val_loss: 1.7260 - val_accuracy: 0.6309
Epoch 75/200
67/67 [==============================] - 1s 11ms/step - loss: 0.7876 - accuracy: 0.8569 - val_loss: 1.8824 - val_accuracy: 0.6146
Epoch 76/200
67/67 [==============================] - 1s 10ms/step - loss: 0.7696 - accuracy: 0.8613 - val_loss: 1.8268 - val_accuracy: 0.6094
Epoch 77/200
67/67 [==============================] - 1s 11ms/step - loss: 0.7546 - accuracy: 0.8660 - val_loss: 1.7448 - val_accuracy: 0.6211
Epoch 78/200
67/67 [==============================] - 1s 11ms/step - loss: 0.7356 - accuracy: 0.8663 - val_loss: 1.7842 - val_accuracy: 0.6204
Epoch 79/200
67/67 [==============================] - 1s 11ms/step - loss: 0.7273 - accuracy: 0.8709 - val_loss: 1.5534 - val_accuracy: 0.6523
Epoch 80/200
67/67 [==============================] - 1s 11ms/step - loss: 0.6955 - accuracy: 0.8762 - val_loss: 1.9772 - val_accuracy: 0.6042
Epoch 81/200
67/67 [==============================] - 1s 11ms/step - loss: 0.7105 - accuracy: 0.8834 - val_loss: 1.6532 - val_accuracy: 0.6452
Epoch 82/200
67/67 [==============================] - 1s 11ms/step - loss: 0.6759 - accuracy: 0.8845 - val_loss: 1.6993 - val_accuracy: 0.6270
Epoch 83/200
67/67 [==============================] - 1s 11ms/step - loss: 0.6786 - accuracy: 0.8798 - val_loss: 1.7178 - val_accuracy: 0.6393
Epoch 84/200
67/67 [==============================] - 1s 10ms/step - loss: 0.6560 - accuracy: 0.8866 - val_loss: 1.7413 - val_accuracy: 0.6413
Epoch 85/200
67/67 [==============================] - 1s 11ms/step - loss: 0.6442 - accuracy: 0.8863 - val_loss: 1.5866 - val_accuracy: 0.6608
Epoch 86/200
67/67 [==============================] - 1s 11ms/step - loss: 0.6538 - accuracy: 0.8858 - val_loss: 1.6143 - val_accuracy: 0.6439
Epoch 87/200
67/67 [==============================] - 1s 10ms/step - loss: 0.6126 - accuracy: 0.8986 - val_loss: 1.8732 - val_accuracy: 0.6165
Epoch 88/200
67/67 [==============================] - 1s 10ms/step - loss: 0.6246 - accuracy: 0.8908 - val_loss: 1.7589 - val_accuracy: 0.6374
Epoch 89/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5976 - accuracy: 0.9009 - val_loss: 1.7329 - val_accuracy: 0.6452
Epoch 90/200
67/67 [==============================] - 1s 10ms/step - loss: 0.5929 - accuracy: 0.8988 - val_loss: 1.6179 - val_accuracy: 0.6549
Epoch 91/200
67/67 [==============================] - 1s 10ms/step - loss: 0.5632 - accuracy: 0.9085 - val_loss: 1.8443 - val_accuracy: 0.6198
Epoch 92/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5732 - accuracy: 0.9017 - val_loss: 1.6148 - val_accuracy: 0.6458
Epoch 93/200
67/67 [==============================] - 1s 10ms/step - loss: 0.5559 - accuracy: 0.9073 - val_loss: 1.7402 - val_accuracy: 0.6302
Epoch 94/200
67/67 [==============================] - 1s 10ms/step - loss: 0.5429 - accuracy: 0.9163 - val_loss: 1.7063 - val_accuracy: 0.6413
Epoch 95/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5585 - accuracy: 0.9078 - val_loss: 1.4946 - val_accuracy: 0.6797
Epoch 96/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5587 - accuracy: 0.9031 - val_loss: 1.5607 - val_accuracy: 0.6589
Epoch 97/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5142 - accuracy: 0.9226 - val_loss: 1.6163 - val_accuracy: 0.6556
Epoch 98/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5145 - accuracy: 0.9207 - val_loss: 1.8373 - val_accuracy: 0.6309
Epoch 99/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5061 - accuracy: 0.9171 - val_loss: 1.5403 - val_accuracy: 0.6738
Epoch 100/200
67/67 [==============================] - 1s 11ms/step - loss: 0.5208 - accuracy: 0.9146 - val_loss: 1.8284 - val_accuracy: 0.6302
Epoch 101/200
67/67 [==============================] - 1s 10ms/step - loss: 0.5063 - accuracy: 0.9141 - val_loss: 1.6571 - val_accuracy: 0.6400
Epoch 102/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4768 - accuracy: 0.9240 - val_loss: 1.5734 - val_accuracy: 0.6543
Epoch 103/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4797 - accuracy: 0.9213 - val_loss: 1.5081 - val_accuracy: 0.6751
Epoch 104/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4759 - accuracy: 0.9271 - val_loss: 1.6406 - val_accuracy: 0.6549
Epoch 105/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4724 - accuracy: 0.9250 - val_loss: 1.6556 - val_accuracy: 0.6543
Epoch 106/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4544 - accuracy: 0.9321 - val_loss: 1.6622 - val_accuracy: 0.6452
Epoch 107/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4579 - accuracy: 0.9273 - val_loss: 1.5217 - val_accuracy: 0.6745
Epoch 108/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4387 - accuracy: 0.9344 - val_loss: 1.6611 - val_accuracy: 0.6576
Epoch 109/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4407 - accuracy: 0.9310 - val_loss: 1.4930 - val_accuracy: 0.6751
Epoch 110/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4318 - accuracy: 0.9366 - val_loss: 1.5698 - val_accuracy: 0.6660
Epoch 111/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4146 - accuracy: 0.9402 - val_loss: 1.6366 - val_accuracy: 0.6562
Epoch 112/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4115 - accuracy: 0.9379 - val_loss: 1.4770 - val_accuracy: 0.6940
Epoch 113/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4132 - accuracy: 0.9357 - val_loss: 1.5196 - val_accuracy: 0.6758
Epoch 114/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4024 - accuracy: 0.9428 - val_loss: 1.6247 - val_accuracy: 0.6693
Epoch 115/200
67/67 [==============================] - 1s 10ms/step - loss: 0.4081 - accuracy: 0.9406 - val_loss: 1.5520 - val_accuracy: 0.6654
Epoch 116/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4119 - accuracy: 0.9344 - val_loss: 1.5517 - val_accuracy: 0.6764
Epoch 117/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3941 - accuracy: 0.9410 - val_loss: 1.7752 - val_accuracy: 0.6354
Epoch 118/200
67/67 [==============================] - 1s 11ms/step - loss: 0.4029 - accuracy: 0.9391 - val_loss: 1.6808 - val_accuracy: 0.6595
Epoch 119/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3887 - accuracy: 0.9410 - val_loss: 1.6866 - val_accuracy: 0.6562
Epoch 120/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3811 - accuracy: 0.9447 - val_loss: 1.6043 - val_accuracy: 0.6582
Epoch 121/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3924 - accuracy: 0.9364 - val_loss: 1.5814 - val_accuracy: 0.6660
Epoch 122/200
67/67 [==============================] - 1s 10ms/step - loss: 0.3855 - accuracy: 0.9399 - val_loss: 1.5365 - val_accuracy: 0.6738
Epoch 123/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3635 - accuracy: 0.9436 - val_loss: 1.5920 - val_accuracy: 0.6680
Epoch 124/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3672 - accuracy: 0.9479 - val_loss: 1.5161 - val_accuracy: 0.6667
Epoch 125/200
67/67 [==============================] - 1s 10ms/step - loss: 0.3685 - accuracy: 0.9476 - val_loss: 1.8121 - val_accuracy: 0.6419
Epoch 126/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3605 - accuracy: 0.9441 - val_loss: 1.6303 - val_accuracy: 0.6673
Epoch 127/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3748 - accuracy: 0.9398 - val_loss: 1.5843 - val_accuracy: 0.6719
Epoch 128/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3467 - accuracy: 0.9496 - val_loss: 1.6112 - val_accuracy: 0.6654
Epoch 129/200
67/67 [==============================] - 1s 10ms/step - loss: 0.3510 - accuracy: 0.9486 - val_loss: 1.6819 - val_accuracy: 0.6452
Epoch 130/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3503 - accuracy: 0.9479 - val_loss: 1.5654 - val_accuracy: 0.6712
Epoch 131/200
67/67 [==============================] - 1s 11ms/step - loss: 0.3478 - accuracy: 0.9468 - val_loss: 1.6275 - val_accuracy: 0.6660
Epoch 132/200
67/67 [==============================] - 1s 10ms/step - loss: 0.3301 - accuracy: 0.9521 - val_loss: 1.6766 - val_accuracy: 0.6602
In [ ]:
_, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
accuracies_opt_128["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.492
Accuracy: 67.725%

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs = 200, callbacks = [callback])
Model: "sequential_9"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_27 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_3 ( (None, 512)               0         
_________________________________________________________________
dense_15 (Dense)             (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
67/67 [==============================] - 4s 37ms/step - loss: 2.7964 - accuracy: 0.1588 - val_loss: 1.6078 - val_accuracy: 0.5234
Epoch 2/200
67/67 [==============================] - 2s 35ms/step - loss: 1.5624 - accuracy: 0.5330 - val_loss: 1.1925 - val_accuracy: 0.6452
Epoch 3/200
67/67 [==============================] - 2s 35ms/step - loss: 1.1071 - accuracy: 0.6697 - val_loss: 0.9988 - val_accuracy: 0.6999
Epoch 4/200
67/67 [==============================] - 2s 35ms/step - loss: 0.7760 - accuracy: 0.7697 - val_loss: 0.9374 - val_accuracy: 0.7318
Epoch 5/200
67/67 [==============================] - 2s 35ms/step - loss: 0.5872 - accuracy: 0.8262 - val_loss: 0.9298 - val_accuracy: 0.7344
Epoch 6/200
67/67 [==============================] - 2s 35ms/step - loss: 0.4620 - accuracy: 0.8581 - val_loss: 0.8596 - val_accuracy: 0.7487
Epoch 7/200
67/67 [==============================] - 2s 35ms/step - loss: 0.3476 - accuracy: 0.8945 - val_loss: 0.9117 - val_accuracy: 0.7598
Epoch 8/200
67/67 [==============================] - 2s 35ms/step - loss: 0.2180 - accuracy: 0.9303 - val_loss: 1.0173 - val_accuracy: 0.7461
Epoch 9/200
67/67 [==============================] - 2s 35ms/step - loss: 0.1601 - accuracy: 0.9497 - val_loss: 1.0211 - val_accuracy: 0.7598
Epoch 10/200
67/67 [==============================] - 2s 35ms/step - loss: 0.1248 - accuracy: 0.9607 - val_loss: 1.0382 - val_accuracy: 0.7624
Epoch 11/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0929 - accuracy: 0.9701 - val_loss: 1.1810 - val_accuracy: 0.7552
Epoch 12/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0912 - accuracy: 0.9720 - val_loss: 1.2306 - val_accuracy: 0.7552
Epoch 13/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0868 - accuracy: 0.9759 - val_loss: 1.2568 - val_accuracy: 0.7493
Epoch 14/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0551 - accuracy: 0.9828 - val_loss: 1.3480 - val_accuracy: 0.7721
Epoch 15/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0582 - accuracy: 0.9815 - val_loss: 1.1899 - val_accuracy: 0.7604
Epoch 16/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0401 - accuracy: 0.9869 - val_loss: 1.1490 - val_accuracy: 0.7819
Epoch 17/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0584 - accuracy: 0.9840 - val_loss: 1.2914 - val_accuracy: 0.7520
Epoch 18/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0553 - accuracy: 0.9860 - val_loss: 1.2604 - val_accuracy: 0.7734
Epoch 19/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0529 - accuracy: 0.9845 - val_loss: 1.2962 - val_accuracy: 0.7773
Epoch 20/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0333 - accuracy: 0.9888 - val_loss: 1.5102 - val_accuracy: 0.7520
Epoch 21/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0402 - accuracy: 0.9863 - val_loss: 1.3591 - val_accuracy: 0.7454
Epoch 22/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0396 - accuracy: 0.9890 - val_loss: 1.2904 - val_accuracy: 0.7786
Epoch 23/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0170 - accuracy: 0.9945 - val_loss: 1.2454 - val_accuracy: 0.7682
Epoch 24/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0249 - accuracy: 0.9917 - val_loss: 1.3767 - val_accuracy: 0.7507
Epoch 25/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0385 - accuracy: 0.9885 - val_loss: 1.4445 - val_accuracy: 0.7493
Epoch 26/200
67/67 [==============================] - 2s 35ms/step - loss: 0.0397 - accuracy: 0.9885 - val_loss: 1.3211 - val_accuracy: 0.7624
In [ ]:
_, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
accuracies_opt_128["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.829
Accuracy: 76.025%
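The parameter count in the summary above can be checked by hand: the new Dense head maps the 512 pooled VGG16 features to the 20 classes, and the whole base is left trainable. A quick sanity check with the values reported by Keras:

```python
# VGG16 convolutional base parameters, as reported in the summary above
vgg16_base = 14_714_688

# Dense head: 512 inputs -> 20 classes, weights plus one bias per class
dense_head = 512 * 20 + 20

total = vgg16_base + dense_head
print(dense_head)  # 10260  -- matches dense_15
print(total)       # 14724948 -- matches "Total params: 14,724,948"
```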
MobileNet
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks=[callback])
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout (Dropout)            (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d (Gl (None, 1280)              0         
_________________________________________________________________
dense (Dense)                (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
67/67 [==============================] - 67s 901ms/step - loss: 2.2048 - accuracy: 0.3828 - val_loss: 2.6248 - val_accuracy: 0.4173
Epoch 2/200
67/67 [==============================] - 59s 879ms/step - loss: 0.4699 - accuracy: 0.8662 - val_loss: 2.2279 - val_accuracy: 0.4779
Epoch 3/200
67/67 [==============================] - 59s 880ms/step - loss: 0.1980 - accuracy: 0.9555 - val_loss: 1.8453 - val_accuracy: 0.5547
Epoch 4/200
67/67 [==============================] - 58s 870ms/step - loss: 0.0979 - accuracy: 0.9845 - val_loss: 1.8715 - val_accuracy: 0.5534
Epoch 5/200
67/67 [==============================] - 59s 875ms/step - loss: 0.0480 - accuracy: 0.9959 - val_loss: 1.9593 - val_accuracy: 0.5358
Epoch 6/200
67/67 [==============================] - 59s 886ms/step - loss: 0.0250 - accuracy: 0.9995 - val_loss: 2.0884 - val_accuracy: 0.5208
Epoch 7/200
67/67 [==============================] - 58s 871ms/step - loss: 0.0158 - accuracy: 0.9997 - val_loss: 2.1503 - val_accuracy: 0.5072
Epoch 8/200
67/67 [==============================] - 58s 873ms/step - loss: 0.0112 - accuracy: 1.0000 - val_loss: 2.2625 - val_accuracy: 0.5013
Epoch 9/200
67/67 [==============================] - 59s 881ms/step - loss: 0.0079 - accuracy: 0.9999 - val_loss: 2.5193 - val_accuracy: 0.4590
Epoch 10/200
67/67 [==============================] - 59s 880ms/step - loss: 0.0070 - accuracy: 1.0000 - val_loss: 2.7267 - val_accuracy: 0.4382
Epoch 11/200
67/67 [==============================] - 59s 881ms/step - loss: 0.0052 - accuracy: 1.0000 - val_loss: 2.9286 - val_accuracy: 0.4193
Epoch 12/200
67/67 [==============================] - 59s 882ms/step - loss: 0.0038 - accuracy: 1.0000 - val_loss: 3.1625 - val_accuracy: 0.3958
Epoch 13/200
67/67 [==============================] - 59s 883ms/step - loss: 0.0034 - accuracy: 1.0000 - val_loss: 3.2623 - val_accuracy: 0.3874
Epoch 14/200
67/67 [==============================] - 58s 872ms/step - loss: 0.0033 - accuracy: 1.0000 - val_loss: 3.4386 - val_accuracy: 0.3626
Epoch 15/200
67/67 [==============================] - 59s 876ms/step - loss: 0.0024 - accuracy: 0.9999 - val_loss: 3.4511 - val_accuracy: 0.3516
Epoch 16/200
67/67 [==============================] - 58s 866ms/step - loss: 0.0021 - accuracy: 1.0000 - val_loss: 3.4586 - val_accuracy: 0.3516
Epoch 17/200
67/67 [==============================] - 59s 879ms/step - loss: 0.0019 - accuracy: 1.0000 - val_loss: 3.5325 - val_accuracy: 0.3444
Epoch 18/200
67/67 [==============================] - 59s 881ms/step - loss: 0.0016 - accuracy: 1.0000 - val_loss: 3.5704 - val_accuracy: 0.3379
Epoch 19/200
67/67 [==============================] - 59s 879ms/step - loss: 0.0015 - accuracy: 1.0000 - val_loss: 3.5667 - val_accuracy: 0.3327
Epoch 20/200
67/67 [==============================] - 59s 877ms/step - loss: 0.0017 - accuracy: 1.0000 - val_loss: 3.5699 - val_accuracy: 0.3262
Epoch 21/200
67/67 [==============================] - 58s 873ms/step - loss: 0.0016 - accuracy: 1.0000 - val_loss: 3.6707 - val_accuracy: 0.3288
Epoch 22/200
67/67 [==============================] - 59s 878ms/step - loss: 0.0012 - accuracy: 1.0000 - val_loss: 3.5902 - val_accuracy: 0.3379
Epoch 23/200
67/67 [==============================] - 58s 873ms/step - loss: 0.0011 - accuracy: 1.0000 - val_loss: 3.5608 - val_accuracy: 0.3424
In [ ]:
_, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
accuracies_opt_128["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.929
Accuracy: 53.613%
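The MobileNetV2 base emits 7×7×1280 feature maps; the GlobalAveragePooling2D layer collapses the two spatial dimensions into a single 1280-vector before the Dense head (1280·20 + 20 = 25,620 parameters, as in the summary above). A small numpy sketch of that pooling step:

```python
import numpy as np

# One feature map of shape (7, 7, 1280), as produced by the MobileNetV2 base
features = np.random.rand(7, 7, 1280)

# Global average pooling: mean over the two spatial axes, channels preserved
pooled = features.mean(axis=(0, 1))
print(pooled.shape)  # (1280,)

# Dense head parameter count: weights plus one bias per class
head_params = 1280 * 20 + 20
print(head_params)  # 25620
```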
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_11"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_29 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_5 ( (None, 1024)              0         
_________________________________________________________________
dense_17 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
67/67 [==============================] - 15s 74ms/step - loss: 3.8231 - accuracy: 0.1097 - val_loss: 2.5018 - val_accuracy: 0.2845
Epoch 2/200
67/67 [==============================] - 3s 51ms/step - loss: 2.1188 - accuracy: 0.3584 - val_loss: 1.9713 - val_accuracy: 0.4733
Epoch 3/200
67/67 [==============================] - 3s 51ms/step - loss: 1.4452 - accuracy: 0.5742 - val_loss: 1.6064 - val_accuracy: 0.6061
Epoch 4/200
67/67 [==============================] - 3s 51ms/step - loss: 0.9955 - accuracy: 0.7017 - val_loss: 1.2890 - val_accuracy: 0.6641
Epoch 5/200
67/67 [==============================] - 3s 51ms/step - loss: 0.7799 - accuracy: 0.7619 - val_loss: 1.0963 - val_accuracy: 0.6921
Epoch 6/200
67/67 [==============================] - 3s 51ms/step - loss: 0.5291 - accuracy: 0.8426 - val_loss: 0.9745 - val_accuracy: 0.7214
Epoch 7/200
67/67 [==============================] - 3s 51ms/step - loss: 0.3813 - accuracy: 0.8850 - val_loss: 0.9492 - val_accuracy: 0.7279
Epoch 8/200
67/67 [==============================] - 3s 51ms/step - loss: 0.2736 - accuracy: 0.9217 - val_loss: 0.9540 - val_accuracy: 0.7259
Epoch 9/200
67/67 [==============================] - 3s 51ms/step - loss: 0.2006 - accuracy: 0.9463 - val_loss: 0.9513 - val_accuracy: 0.7363
Epoch 10/200
67/67 [==============================] - 3s 51ms/step - loss: 0.1432 - accuracy: 0.9578 - val_loss: 0.9706 - val_accuracy: 0.7396
Epoch 11/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0983 - accuracy: 0.9781 - val_loss: 0.9972 - val_accuracy: 0.7370
Epoch 12/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0726 - accuracy: 0.9870 - val_loss: 1.0000 - val_accuracy: 0.7441
Epoch 13/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0612 - accuracy: 0.9863 - val_loss: 1.0184 - val_accuracy: 0.7435
Epoch 14/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0491 - accuracy: 0.9921 - val_loss: 1.0421 - val_accuracy: 0.7487
Epoch 15/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0414 - accuracy: 0.9901 - val_loss: 1.0405 - val_accuracy: 0.7415
Epoch 16/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0335 - accuracy: 0.9933 - val_loss: 1.0943 - val_accuracy: 0.7389
Epoch 17/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0213 - accuracy: 0.9971 - val_loss: 1.0602 - val_accuracy: 0.7539
Epoch 18/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0175 - accuracy: 0.9976 - val_loss: 1.0818 - val_accuracy: 0.7513
Epoch 19/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0250 - accuracy: 0.9943 - val_loss: 1.1030 - val_accuracy: 0.7487
Epoch 20/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0234 - accuracy: 0.9970 - val_loss: 1.0844 - val_accuracy: 0.7454
Epoch 21/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0250 - accuracy: 0.9958 - val_loss: 1.1294 - val_accuracy: 0.7435
Epoch 22/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0181 - accuracy: 0.9974 - val_loss: 1.0887 - val_accuracy: 0.7461
Epoch 23/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0153 - accuracy: 0.9971 - val_loss: 1.0970 - val_accuracy: 0.7565
Epoch 24/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0146 - accuracy: 0.9973 - val_loss: 1.1285 - val_accuracy: 0.7441
Epoch 25/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0240 - accuracy: 0.9945 - val_loss: 1.1171 - val_accuracy: 0.7474
Epoch 26/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0177 - accuracy: 0.9958 - val_loss: 1.1530 - val_accuracy: 0.7507
Epoch 27/200
67/67 [==============================] - 3s 51ms/step - loss: 0.0139 - accuracy: 0.9962 - val_loss: 1.1662 - val_accuracy: 0.7467
In [ ]:
_, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
accuracies_opt_128["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.985
Accuracy: 72.168%
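With the test accuracies of all four optimized models collected in `accuracies_opt_128`, the batch-128 runs can be compared directly. A minimal sketch, restating the percentages reported above as a plain dictionary:

```python
# Test-set accuracies (%) reported above for batch size 128
accuracies_opt_128 = {
    "CNN2": 67.725,
    "VGG_ALL": 76.025,
    "MOBILENET_ALL": 53.613,
    "DENSENET_ALL": 72.168,
}

# Rank the models from best to worst test accuracy
ranking = sorted(accuracies_opt_128, key=accuracies_opt_128.get, reverse=True)
print(ranking)  # ['VGG_ALL', 'DENSENET_ALL', 'CNN2', 'MOBILENET_ALL']
```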

Batch size = 200

In [ ]:
BATCH_SIZE = 200

# Build a shuffled, repeated, batched and prefetched tf.data pipeline
def _input_fn(x, y, batch_size):
  ds = tf.data.Dataset.from_tensor_slices((x, y))
  ds = ds.shuffle(buffer_size=data_size)
  ds = ds.repeat()
  ds = ds.batch(batch_size)
  ds = ds.prefetch(buffer_size=AUTOTUNE)
  return ds

train_ds = _input_fn(x_train, y_train, BATCH_SIZE)            # PrefetchDataset
validation_ds = _input_fn(x_val, y_val, BATCH_SIZE)           # PrefetchDataset
test_ds = _input_fn(x_test, y_test, BATCH_SIZE)               # PrefetchDataset

# Resized copies for the pretrained models that expect larger inputs (e.g. MobileNetV2)
train_ds_res = train_ds.map(resize_transform)
validation_ds_res = validation_ds.map(resize_transform)
test_ds_res = test_ds.map(resize_transform)

def train_model(model, train_dataset=train_ds, validation_dataset=validation_ds,
                epochs=100, callbacks=None,
                steps_per_epoch=int(np.ceil(x_train.shape[0] / BATCH_SIZE)),
                validation_steps=int(np.ceil(x_val.shape[0] / BATCH_SIZE))):
  history = model.fit(train_dataset, epochs=epochs, steps_per_epoch=steps_per_epoch,
                      validation_data=validation_dataset, validation_steps=validation_steps,
                      callbacks=callbacks)
  return history

def model_report(model, history, evaluation_dataset=test_ds,
                 evaluation_steps=int(np.ceil(x_test.shape[0] / BATCH_SIZE))):
  plt = summarize_diagnostics(history)
  plt.show()
  return model_evaluation(model, evaluation_dataset, evaluation_steps)
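The `steps_per_epoch` defaults above use `np.ceil` so that the final, partial batch still counts as a step. This is what produces the 43 steps per epoch visible in the logs below (and the 67 steps seen at batch size 128). A quick check, assuming a training-set size of 8,500 samples (a hypothetical figure, but one consistent with both step counts):

```python
import math

TRAIN_SAMPLES = 8_500  # assumed; consistent with the 67- and 43-step epochs

# ceil: the last partial batch is not dropped
steps_128 = math.ceil(TRAIN_SAMPLES / 128)
steps_200 = math.ceil(TRAIN_SAMPLES / 200)
print(steps_128)  # 67 steps per epoch at batch size 128
print(steps_200)  # 43 steps per epoch at batch size 200
```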

"From scratch" networks

In [ ]:
accuracies_opt_200 = {}
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_12"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_20 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_20 (Batc (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_20 (ReLU)              (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_14 (MaxPooling (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_30 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_21 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_21 (Batc (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_21 (ReLU)              (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_15 (MaxPooling (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_31 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_22 (Conv2D)           (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_22 (Batc (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_22 (ReLU)              (None, 4, 4, 64)          0         
_________________________________________________________________
flatten_6 (Flatten)          (None, 1024)              0         
_________________________________________________________________
dropout_32 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_18 (Dense)             (None, 64)                65600     
_________________________________________________________________
dense_19 (Dense)             (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
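The Conv2D parameter counts in the summary above follow directly from (kernel_h · kernel_w · in_channels + 1) · filters, with 3×3 kernels (which also explains the 32→30 spatial shrink of the first layer):

```python
def conv2d_params(kernel, in_channels, filters):
    """Parameters of a Conv2D layer: kernel weights plus one bias per filter."""
    return (kernel * kernel * in_channels + 1) * filters

print(conv2d_params(3, 3, 32))   # 896    -- conv2d_20 (RGB input)
print(conv2d_params(3, 32, 64))  # 18496  -- conv2d_21
print(conv2d_params(3, 64, 64))  # 36928  -- conv2d_22
```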
Epoch 1/200
43/43 [==============================] - 1s 15ms/step - loss: 4.4096 - accuracy: 0.0592 - val_loss: 4.0970 - val_accuracy: 0.0531
Epoch 2/200
43/43 [==============================] - 0s 11ms/step - loss: 4.0633 - accuracy: 0.1018 - val_loss: 4.1154 - val_accuracy: 0.0637
Epoch 3/200
43/43 [==============================] - 0s 11ms/step - loss: 3.8908 - accuracy: 0.1436 - val_loss: 4.1788 - val_accuracy: 0.0819
Epoch 4/200
43/43 [==============================] - 0s 11ms/step - loss: 3.7698 - accuracy: 0.1686 - val_loss: 4.2573 - val_accuracy: 0.0769
Epoch 5/200
43/43 [==============================] - 0s 11ms/step - loss: 3.6877 - accuracy: 0.1860 - val_loss: 4.3239 - val_accuracy: 0.0800
Epoch 6/200
43/43 [==============================] - 0s 11ms/step - loss: 3.5871 - accuracy: 0.2242 - val_loss: 4.3681 - val_accuracy: 0.0800
Epoch 7/200
43/43 [==============================] - 0s 11ms/step - loss: 3.4948 - accuracy: 0.2423 - val_loss: 4.3618 - val_accuracy: 0.0781
Epoch 8/200
43/43 [==============================] - 0s 11ms/step - loss: 3.3809 - accuracy: 0.2705 - val_loss: 4.2358 - val_accuracy: 0.1006
Epoch 9/200
43/43 [==============================] - 0s 11ms/step - loss: 3.3074 - accuracy: 0.2784 - val_loss: 4.1190 - val_accuracy: 0.1031
Epoch 10/200
43/43 [==============================] - 0s 11ms/step - loss: 3.2557 - accuracy: 0.2939 - val_loss: 3.9033 - val_accuracy: 0.1219
Epoch 11/200
43/43 [==============================] - 0s 11ms/step - loss: 3.1762 - accuracy: 0.3162 - val_loss: 3.7054 - val_accuracy: 0.1581
Epoch 12/200
43/43 [==============================] - 0s 11ms/step - loss: 3.0725 - accuracy: 0.3384 - val_loss: 3.4954 - val_accuracy: 0.2037
Epoch 13/200
43/43 [==============================] - 0s 11ms/step - loss: 3.0297 - accuracy: 0.3449 - val_loss: 3.3181 - val_accuracy: 0.2488
Epoch 14/200
43/43 [==============================] - 0s 11ms/step - loss: 2.9473 - accuracy: 0.3615 - val_loss: 3.1973 - val_accuracy: 0.2850
Epoch 15/200
43/43 [==============================] - 0s 11ms/step - loss: 2.9099 - accuracy: 0.3652 - val_loss: 3.0556 - val_accuracy: 0.3275
Epoch 16/200
43/43 [==============================] - 0s 11ms/step - loss: 2.8399 - accuracy: 0.3871 - val_loss: 3.0531 - val_accuracy: 0.3250
Epoch 17/200
43/43 [==============================] - 0s 11ms/step - loss: 2.7913 - accuracy: 0.3882 - val_loss: 2.8931 - val_accuracy: 0.3575
Epoch 18/200
43/43 [==============================] - 0s 11ms/step - loss: 2.7293 - accuracy: 0.4091 - val_loss: 2.7906 - val_accuracy: 0.3831
Epoch 19/200
43/43 [==============================] - 0s 11ms/step - loss: 2.6567 - accuracy: 0.4107 - val_loss: 2.7785 - val_accuracy: 0.3900
Epoch 20/200
43/43 [==============================] - 0s 11ms/step - loss: 2.6129 - accuracy: 0.4180 - val_loss: 2.7976 - val_accuracy: 0.3762
Epoch 21/200
43/43 [==============================] - 0s 11ms/step - loss: 2.5726 - accuracy: 0.4347 - val_loss: 2.6279 - val_accuracy: 0.4200
Epoch 22/200
43/43 [==============================] - 0s 11ms/step - loss: 2.5425 - accuracy: 0.4452 - val_loss: 2.5990 - val_accuracy: 0.4300
Epoch 23/200
43/43 [==============================] - 0s 11ms/step - loss: 2.4804 - accuracy: 0.4424 - val_loss: 2.7138 - val_accuracy: 0.3844
Epoch 24/200
43/43 [==============================] - 0s 11ms/step - loss: 2.4715 - accuracy: 0.4483 - val_loss: 2.6603 - val_accuracy: 0.4000
Epoch 25/200
43/43 [==============================] - 0s 11ms/step - loss: 2.4194 - accuracy: 0.4629 - val_loss: 2.6402 - val_accuracy: 0.3956
Epoch 26/200
43/43 [==============================] - 0s 11ms/step - loss: 2.3849 - accuracy: 0.4641 - val_loss: 2.4976 - val_accuracy: 0.4450
Epoch 27/200
43/43 [==============================] - 0s 11ms/step - loss: 2.3334 - accuracy: 0.4779 - val_loss: 2.5292 - val_accuracy: 0.4331
Epoch 28/200
43/43 [==============================] - 0s 11ms/step - loss: 2.3032 - accuracy: 0.4849 - val_loss: 2.4885 - val_accuracy: 0.4494
Epoch 29/200
43/43 [==============================] - 0s 11ms/step - loss: 2.2775 - accuracy: 0.4871 - val_loss: 2.3697 - val_accuracy: 0.4794
Epoch 30/200
43/43 [==============================] - 0s 11ms/step - loss: 2.2444 - accuracy: 0.4913 - val_loss: 2.4796 - val_accuracy: 0.4369
Epoch 31/200
43/43 [==============================] - 0s 11ms/step - loss: 2.2054 - accuracy: 0.4993 - val_loss: 2.3555 - val_accuracy: 0.4781
Epoch 32/200
43/43 [==============================] - 0s 11ms/step - loss: 2.1671 - accuracy: 0.5141 - val_loss: 2.3091 - val_accuracy: 0.4906
Epoch 33/200
43/43 [==============================] - 0s 11ms/step - loss: 2.1486 - accuracy: 0.5190 - val_loss: 2.3496 - val_accuracy: 0.4756
Epoch 34/200
43/43 [==============================] - 0s 11ms/step - loss: 2.1423 - accuracy: 0.5097 - val_loss: 2.3473 - val_accuracy: 0.4700
Epoch 35/200
43/43 [==============================] - 0s 11ms/step - loss: 2.0816 - accuracy: 0.5273 - val_loss: 2.3788 - val_accuracy: 0.4675
Epoch 36/200
43/43 [==============================] - 0s 11ms/step - loss: 2.0738 - accuracy: 0.5262 - val_loss: 2.2289 - val_accuracy: 0.5050
Epoch 37/200
43/43 [==============================] - 0s 11ms/step - loss: 2.0570 - accuracy: 0.5231 - val_loss: 2.3698 - val_accuracy: 0.4613
Epoch 38/200
43/43 [==============================] - 0s 11ms/step - loss: 2.0517 - accuracy: 0.5313 - val_loss: 2.2260 - val_accuracy: 0.4944
Epoch 39/200
43/43 [==============================] - 0s 11ms/step - loss: 2.0063 - accuracy: 0.5302 - val_loss: 2.1201 - val_accuracy: 0.5150
Epoch 40/200
43/43 [==============================] - 0s 11ms/step - loss: 1.9908 - accuracy: 0.5426 - val_loss: 2.1520 - val_accuracy: 0.5013
Epoch 41/200
43/43 [==============================] - 0s 11ms/step - loss: 1.9817 - accuracy: 0.5377 - val_loss: 2.1197 - val_accuracy: 0.5200
Epoch 42/200
43/43 [==============================] - 0s 11ms/step - loss: 1.9279 - accuracy: 0.5510 - val_loss: 2.1836 - val_accuracy: 0.5013
Epoch 43/200
43/43 [==============================] - 0s 11ms/step - loss: 1.9229 - accuracy: 0.5598 - val_loss: 2.1082 - val_accuracy: 0.5181
Epoch 44/200
43/43 [==============================] - 0s 11ms/step - loss: 1.8699 - accuracy: 0.5646 - val_loss: 2.0631 - val_accuracy: 0.5250
Epoch 45/200
43/43 [==============================] - 0s 11ms/step - loss: 1.8648 - accuracy: 0.5646 - val_loss: 2.0949 - val_accuracy: 0.5150
Epoch 46/200
43/43 [==============================] - 0s 11ms/step - loss: 1.8383 - accuracy: 0.5763 - val_loss: 2.1496 - val_accuracy: 0.5019
Epoch 47/200
43/43 [==============================] - 0s 11ms/step - loss: 1.8286 - accuracy: 0.5751 - val_loss: 2.0911 - val_accuracy: 0.5125
Epoch 48/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7996 - accuracy: 0.5825 - val_loss: 2.0743 - val_accuracy: 0.5100
Epoch 49/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7873 - accuracy: 0.5863 - val_loss: 1.9529 - val_accuracy: 0.5325
Epoch 50/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7988 - accuracy: 0.5756 - val_loss: 2.0703 - val_accuracy: 0.5081
Epoch 51/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7519 - accuracy: 0.5786 - val_loss: 1.9570 - val_accuracy: 0.5456
Epoch 52/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7418 - accuracy: 0.5897 - val_loss: 1.8946 - val_accuracy: 0.5556
Epoch 53/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7151 - accuracy: 0.6016 - val_loss: 1.8887 - val_accuracy: 0.5675
Epoch 54/200
43/43 [==============================] - 0s 11ms/step - loss: 1.7233 - accuracy: 0.5964 - val_loss: 1.8886 - val_accuracy: 0.5631
Epoch 55/200
43/43 [==============================] - 0s 11ms/step - loss: 1.6871 - accuracy: 0.6063 - val_loss: 1.9397 - val_accuracy: 0.5400
Epoch 56/200
43/43 [==============================] - 0s 11ms/step - loss: 1.6874 - accuracy: 0.5976 - val_loss: 1.9680 - val_accuracy: 0.5394
Epoch 57/200
43/43 [==============================] - 0s 11ms/step - loss: 1.6628 - accuracy: 0.6055 - val_loss: 1.8283 - val_accuracy: 0.5694
Epoch 58/200
43/43 [==============================] - 0s 11ms/step - loss: 1.6217 - accuracy: 0.6147 - val_loss: 1.9315 - val_accuracy: 0.5312
Epoch 59/200
43/43 [==============================] - 0s 12ms/step - loss: 1.6248 - accuracy: 0.6117 - val_loss: 1.8844 - val_accuracy: 0.5475
Epoch 60/200
43/43 [==============================] - 0s 11ms/step - loss: 1.6050 - accuracy: 0.6147 - val_loss: 1.8171 - val_accuracy: 0.5500
Epoch 61/200
43/43 [==============================] - 0s 11ms/step - loss: 1.6013 - accuracy: 0.6142 - val_loss: 1.9549 - val_accuracy: 0.5269
Epoch 62/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5649 - accuracy: 0.6214 - val_loss: 1.8628 - val_accuracy: 0.5512
Epoch 63/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5728 - accuracy: 0.6246 - val_loss: 1.9229 - val_accuracy: 0.5350
Epoch 64/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5306 - accuracy: 0.6327 - val_loss: 1.9206 - val_accuracy: 0.5394
Epoch 65/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5635 - accuracy: 0.6134 - val_loss: 1.9423 - val_accuracy: 0.5319
Epoch 66/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5386 - accuracy: 0.6346 - val_loss: 1.9777 - val_accuracy: 0.5194
Epoch 67/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5124 - accuracy: 0.6353 - val_loss: 1.7952 - val_accuracy: 0.5587
Epoch 68/200
43/43 [==============================] - 0s 11ms/step - loss: 1.5178 - accuracy: 0.6266 - val_loss: 1.8127 - val_accuracy: 0.5619
Epoch 69/200
43/43 [==============================] - 0s 11ms/step - loss: 1.4763 - accuracy: 0.6418 - val_loss: 1.7789 - val_accuracy: 0.5669
Epoch 70/200
43/43 [==============================] - 0s 11ms/step - loss: 1.4483 - accuracy: 0.6417 - val_loss: 1.7745 - val_accuracy: 0.5744
Epoch 71/200
43/43 [==============================] - 0s 11ms/step - loss: 1.4727 - accuracy: 0.6417 - val_loss: 1.6911 - val_accuracy: 0.5881
Epoch 72/200
43/43 [==============================] - 0s 12ms/step - loss: 1.4387 - accuracy: 0.6538 - val_loss: 1.7603 - val_accuracy: 0.5675
Epoch 73/200
43/43 [==============================] - 0s 11ms/step - loss: 1.4186 - accuracy: 0.6506 - val_loss: 1.7346 - val_accuracy: 0.5938
Epoch 74/200
43/43 [==============================] - 0s 11ms/step - loss: 1.4482 - accuracy: 0.6370 - val_loss: 1.8668 - val_accuracy: 0.5450
Epoch 75/200
43/43 [==============================] - 1s 12ms/step - loss: 1.3858 - accuracy: 0.6548 - val_loss: 1.7823 - val_accuracy: 0.5619
Epoch 76/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3847 - accuracy: 0.6673 - val_loss: 1.7143 - val_accuracy: 0.5919
Epoch 77/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3746 - accuracy: 0.6636 - val_loss: 1.6969 - val_accuracy: 0.5863
Epoch 78/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3549 - accuracy: 0.6671 - val_loss: 1.5917 - val_accuracy: 0.6100
Epoch 79/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3405 - accuracy: 0.6815 - val_loss: 1.6810 - val_accuracy: 0.5831
Epoch 80/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3559 - accuracy: 0.6659 - val_loss: 1.6972 - val_accuracy: 0.5881
Epoch 81/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3140 - accuracy: 0.6780 - val_loss: 1.6202 - val_accuracy: 0.6019
Epoch 82/200
43/43 [==============================] - 0s 11ms/step - loss: 1.3295 - accuracy: 0.6663 - val_loss: 1.7017 - val_accuracy: 0.5788
Epoch 83/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2988 - accuracy: 0.6797 - val_loss: 1.6212 - val_accuracy: 0.6119
Epoch 84/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2829 - accuracy: 0.6879 - val_loss: 1.8318 - val_accuracy: 0.5600
Epoch 85/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2808 - accuracy: 0.6868 - val_loss: 1.6490 - val_accuracy: 0.6069
Epoch 86/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2944 - accuracy: 0.6842 - val_loss: 1.6749 - val_accuracy: 0.5950
Epoch 87/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2533 - accuracy: 0.6851 - val_loss: 1.8310 - val_accuracy: 0.5469
Epoch 88/200
43/43 [==============================] - 0s 12ms/step - loss: 1.2246 - accuracy: 0.6954 - val_loss: 1.5785 - val_accuracy: 0.6194
Epoch 89/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2346 - accuracy: 0.6898 - val_loss: 1.5812 - val_accuracy: 0.6137
Epoch 90/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2430 - accuracy: 0.6935 - val_loss: 1.6810 - val_accuracy: 0.5769
Epoch 91/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2131 - accuracy: 0.7024 - val_loss: 1.6386 - val_accuracy: 0.5900
Epoch 92/200
43/43 [==============================] - 0s 12ms/step - loss: 1.2137 - accuracy: 0.7046 - val_loss: 1.5898 - val_accuracy: 0.6031
Epoch 93/200
43/43 [==============================] - 0s 11ms/step - loss: 1.2142 - accuracy: 0.6981 - val_loss: 1.6701 - val_accuracy: 0.5869
Epoch 94/200
43/43 [==============================] - 0s 12ms/step - loss: 1.2079 - accuracy: 0.7000 - val_loss: 1.6936 - val_accuracy: 0.5850
Epoch 95/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1903 - accuracy: 0.7086 - val_loss: 1.6467 - val_accuracy: 0.5931
Epoch 96/200
43/43 [==============================] - 0s 12ms/step - loss: 1.1919 - accuracy: 0.6985 - val_loss: 1.5762 - val_accuracy: 0.6050
Epoch 97/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1607 - accuracy: 0.7142 - val_loss: 1.7201 - val_accuracy: 0.5888
Epoch 98/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1641 - accuracy: 0.7213 - val_loss: 1.5086 - val_accuracy: 0.6250
Epoch 99/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1324 - accuracy: 0.7169 - val_loss: 1.5893 - val_accuracy: 0.6025
Epoch 100/200
43/43 [==============================] - 0s 12ms/step - loss: 1.1552 - accuracy: 0.7092 - val_loss: 1.5273 - val_accuracy: 0.6206
Epoch 101/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1487 - accuracy: 0.7072 - val_loss: 1.5501 - val_accuracy: 0.6169
Epoch 102/200
43/43 [==============================] - 0s 12ms/step - loss: 1.1413 - accuracy: 0.7065 - val_loss: 1.4446 - val_accuracy: 0.6431
Epoch 103/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1111 - accuracy: 0.7243 - val_loss: 1.4764 - val_accuracy: 0.6331
Epoch 104/200
43/43 [==============================] - 0s 12ms/step - loss: 1.1146 - accuracy: 0.7179 - val_loss: 1.5462 - val_accuracy: 0.6125
Epoch 105/200
43/43 [==============================] - 0s 11ms/step - loss: 1.1124 - accuracy: 0.7251 - val_loss: 1.5021 - val_accuracy: 0.6219
Epoch 106/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0698 - accuracy: 0.7273 - val_loss: 1.4889 - val_accuracy: 0.6225
Epoch 107/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0779 - accuracy: 0.7230 - val_loss: 1.5533 - val_accuracy: 0.6194
Epoch 108/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0747 - accuracy: 0.7286 - val_loss: 1.4439 - val_accuracy: 0.6438
Epoch 109/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0844 - accuracy: 0.7249 - val_loss: 1.5500 - val_accuracy: 0.6119
Epoch 110/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0656 - accuracy: 0.7274 - val_loss: 1.5033 - val_accuracy: 0.6219
Epoch 111/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0642 - accuracy: 0.7325 - val_loss: 1.6289 - val_accuracy: 0.5938
Epoch 112/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0514 - accuracy: 0.7308 - val_loss: 1.4589 - val_accuracy: 0.6431
Epoch 113/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0566 - accuracy: 0.7331 - val_loss: 1.5012 - val_accuracy: 0.6263
Epoch 114/200
43/43 [==============================] - 0s 12ms/step - loss: 1.0611 - accuracy: 0.7267 - val_loss: 1.4457 - val_accuracy: 0.6438
Epoch 115/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0453 - accuracy: 0.7341 - val_loss: 1.4752 - val_accuracy: 0.6438
Epoch 116/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0307 - accuracy: 0.7370 - val_loss: 1.4682 - val_accuracy: 0.6363
Epoch 117/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0209 - accuracy: 0.7386 - val_loss: 1.4987 - val_accuracy: 0.6200
Epoch 118/200
43/43 [==============================] - 0s 12ms/step - loss: 0.9923 - accuracy: 0.7458 - val_loss: 1.5326 - val_accuracy: 0.6169
Epoch 119/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0259 - accuracy: 0.7383 - val_loss: 1.4031 - val_accuracy: 0.6556
Epoch 120/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9878 - accuracy: 0.7460 - val_loss: 1.4512 - val_accuracy: 0.6356
Epoch 121/200
43/43 [==============================] - 0s 11ms/step - loss: 1.0168 - accuracy: 0.7391 - val_loss: 1.5726 - val_accuracy: 0.6112
Epoch 122/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9910 - accuracy: 0.7457 - val_loss: 1.4427 - val_accuracy: 0.6425
Epoch 123/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9940 - accuracy: 0.7503 - val_loss: 1.3838 - val_accuracy: 0.6594
Epoch 124/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9926 - accuracy: 0.7460 - val_loss: 1.4221 - val_accuracy: 0.6450
Epoch 125/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9610 - accuracy: 0.7593 - val_loss: 1.3992 - val_accuracy: 0.6612
Epoch 126/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9924 - accuracy: 0.7366 - val_loss: 1.5029 - val_accuracy: 0.6200
Epoch 127/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9575 - accuracy: 0.7529 - val_loss: 1.3636 - val_accuracy: 0.6606
Epoch 128/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9487 - accuracy: 0.7581 - val_loss: 1.4402 - val_accuracy: 0.6331
Epoch 129/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9394 - accuracy: 0.7555 - val_loss: 1.4079 - val_accuracy: 0.6606
Epoch 130/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9374 - accuracy: 0.7637 - val_loss: 1.3469 - val_accuracy: 0.6594
Epoch 131/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9258 - accuracy: 0.7633 - val_loss: 1.4623 - val_accuracy: 0.6406
Epoch 132/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9213 - accuracy: 0.7693 - val_loss: 1.3552 - val_accuracy: 0.6631
Epoch 133/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9487 - accuracy: 0.7502 - val_loss: 1.4612 - val_accuracy: 0.6281
Epoch 134/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8987 - accuracy: 0.7683 - val_loss: 1.4265 - val_accuracy: 0.6425
Epoch 135/200
43/43 [==============================] - 0s 12ms/step - loss: 0.9135 - accuracy: 0.7652 - val_loss: 1.4515 - val_accuracy: 0.6300
Epoch 136/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9058 - accuracy: 0.7652 - val_loss: 1.3993 - val_accuracy: 0.6469
Epoch 137/200
43/43 [==============================] - 0s 11ms/step - loss: 0.8980 - accuracy: 0.7679 - val_loss: 1.4435 - val_accuracy: 0.6350
Epoch 138/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8893 - accuracy: 0.7723 - val_loss: 1.3537 - val_accuracy: 0.6587
Epoch 139/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8961 - accuracy: 0.7724 - val_loss: 1.3527 - val_accuracy: 0.6575
Epoch 140/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8862 - accuracy: 0.7624 - val_loss: 1.3644 - val_accuracy: 0.6531
Epoch 141/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8630 - accuracy: 0.7799 - val_loss: 1.3767 - val_accuracy: 0.6481
Epoch 142/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8703 - accuracy: 0.7709 - val_loss: 1.3622 - val_accuracy: 0.6600
Epoch 143/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8721 - accuracy: 0.7679 - val_loss: 1.4578 - val_accuracy: 0.6237
Epoch 144/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8621 - accuracy: 0.7812 - val_loss: 1.5176 - val_accuracy: 0.6212
Epoch 145/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8604 - accuracy: 0.7712 - val_loss: 1.4790 - val_accuracy: 0.6275
Epoch 146/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8546 - accuracy: 0.7837 - val_loss: 1.4290 - val_accuracy: 0.6313
Epoch 147/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8541 - accuracy: 0.7707 - val_loss: 1.3414 - val_accuracy: 0.6637
Epoch 148/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8393 - accuracy: 0.7821 - val_loss: 1.3712 - val_accuracy: 0.6550
Epoch 149/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8397 - accuracy: 0.7804 - val_loss: 1.3884 - val_accuracy: 0.6500
Epoch 150/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8663 - accuracy: 0.7682 - val_loss: 1.3968 - val_accuracy: 0.6488
Epoch 151/200
43/43 [==============================] - 0s 11ms/step - loss: 0.8191 - accuracy: 0.7898 - val_loss: 1.4354 - val_accuracy: 0.6319
Epoch 152/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8546 - accuracy: 0.7797 - val_loss: 1.3946 - val_accuracy: 0.6612
Epoch 153/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8439 - accuracy: 0.7732 - val_loss: 1.4509 - val_accuracy: 0.6331
Epoch 154/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8217 - accuracy: 0.7883 - val_loss: 1.2603 - val_accuracy: 0.6781
Epoch 155/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8026 - accuracy: 0.7874 - val_loss: 1.3423 - val_accuracy: 0.6606
Epoch 156/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8256 - accuracy: 0.7853 - val_loss: 1.3135 - val_accuracy: 0.6619
Epoch 157/200
43/43 [==============================] - 0s 12ms/step - loss: 0.7949 - accuracy: 0.7970 - val_loss: 1.4232 - val_accuracy: 0.6506
Epoch 158/200
43/43 [==============================] - 0s 11ms/step - loss: 0.7962 - accuracy: 0.7956 - val_loss: 1.3728 - val_accuracy: 0.6594
Epoch 159/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8037 - accuracy: 0.7874 - val_loss: 1.4587 - val_accuracy: 0.6356
Epoch 160/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7798 - accuracy: 0.7939 - val_loss: 1.4242 - val_accuracy: 0.6431
Epoch 161/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8031 - accuracy: 0.7895 - val_loss: 1.3780 - val_accuracy: 0.6644
Epoch 162/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7848 - accuracy: 0.7937 - val_loss: 1.5085 - val_accuracy: 0.6194
Epoch 163/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7720 - accuracy: 0.7947 - val_loss: 1.3787 - val_accuracy: 0.6556
Epoch 164/200
43/43 [==============================] - 0s 11ms/step - loss: 0.7686 - accuracy: 0.7954 - val_loss: 1.3684 - val_accuracy: 0.6438
Epoch 165/200
43/43 [==============================] - 0s 11ms/step - loss: 0.7686 - accuracy: 0.7971 - val_loss: 1.4432 - val_accuracy: 0.6431
Epoch 166/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7755 - accuracy: 0.7958 - val_loss: 1.4140 - val_accuracy: 0.6475
Epoch 167/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7793 - accuracy: 0.7980 - val_loss: 1.3837 - val_accuracy: 0.6550
Epoch 168/200
43/43 [==============================] - 0s 12ms/step - loss: 0.7491 - accuracy: 0.8068 - val_loss: 1.4855 - val_accuracy: 0.6338
Epoch 169/200
43/43 [==============================] - 0s 11ms/step - loss: 0.7656 - accuracy: 0.7979 - val_loss: 1.2859 - val_accuracy: 0.6800
Epoch 170/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7473 - accuracy: 0.8095 - val_loss: 1.3139 - val_accuracy: 0.6744
Epoch 171/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7329 - accuracy: 0.8104 - val_loss: 1.4068 - val_accuracy: 0.6444
Epoch 172/200
43/43 [==============================] - 0s 12ms/step - loss: 0.7598 - accuracy: 0.7956 - val_loss: 1.4612 - val_accuracy: 0.6381
Epoch 173/200
43/43 [==============================] - 0s 12ms/step - loss: 0.7339 - accuracy: 0.8069 - val_loss: 1.3085 - val_accuracy: 0.6669
Epoch 174/200
43/43 [==============================] - 0s 11ms/step - loss: 0.7338 - accuracy: 0.8042 - val_loss: 1.4241 - val_accuracy: 0.6556
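The run above halts at epoch 174 of the 200 requested, consistent with the early-stopping `callback` passed to `train_model` (defined earlier in the notebook, presumably `tf.keras.callbacks.EarlyStopping` monitoring `val_loss`). The stopping rule can be sketched in plain Python; the `patience` value below is a hypothetical placeholder, not the one actually used:

```python
def should_stop(val_losses, patience):
    """Early-stopping rule: stop once the best validation loss
    occurred more than `patience` epochs ago."""
    best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__)
    return (len(val_losses) - 1) - best_epoch >= patience

# Example: best loss at index 2, then three epochs without improvement
history = [2.0, 1.5, 1.2, 1.3, 1.4, 1.6]
print(should_stop(history, patience=3))  # → True
```

With `restore_best_weights=True`, Keras additionally rolls the model back to the weights of that best epoch before evaluation.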
In [ ]:
_, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
accuracies_opt_200["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.285
Accuracy: 66.500%
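The report above is produced by the notebook's `model_report` helper. A minimal sketch of its evaluation-and-print step, assuming it wraps Keras' `model.evaluate` (the exact signature and formatting of the real helper are assumptions):

```python
def report_test_metrics(model, x_test, y_test):
    # Evaluate on the held-out test set; Keras returns (loss, accuracy)
    # for a model compiled with a single accuracy metric.
    loss, accuracy = model.evaluate(x_test, y_test, verbose=0)
    print("Test set evaluation metrics")
    print("---------------------------")
    print(f"Loss:     {loss:.3f}")
    print(f"Accuracy: {accuracy:.3%}")
    return loss, accuracy
```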
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_13"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_23 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_23 (Batc (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_23 (ReLU)              (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_16 (MaxPooling (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_33 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_24 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_24 (Batc (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_24 (ReLU)              (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_17 (MaxPooling (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_34 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_25 (Conv2D)           (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_25 (Batc (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_25 (ReLU)              (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d_2 (Average (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_35 (Dropout)         (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_7 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_20 (Dense)             (None, 1024)              525312    
_________________________________________________________________
dropout_36 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_21 (Dense)             (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
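The parameter counts in the summary above can be cross-checked arithmetically. The sketch below recomputes each layer's parameters from its shape: a Conv2D layer holds a k×k kernel per (input, output) channel pair plus one bias per filter, BatchNormalization holds gamma/beta (trainable) plus the moving mean/variance (non-trainable), and Dense holds a weight per input–output pair plus biases:

```python
def conv2d_params(k, c_in, c_out):
    # k x k kernel per (input, output) channel pair, plus one bias per filter
    return k * k * c_in * c_out + c_out

def bn_params(channels):
    # gamma + beta (trainable) and moving mean + variance (non-trainable)
    return 4 * channels

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

total = (conv2d_params(3, 3, 32)   + bn_params(32)    # 896 + 128
       + conv2d_params(3, 32, 64)  + bn_params(64)    # 18496 + 256
       + conv2d_params(3, 64, 128) + bn_params(128)   # 73856 + 512
       + dense_params(512, 1024)                      # 525312
       + dense_params(1024, 20))                      # 20500
non_trainable = 2 * (32 + 64 + 128)                   # moving statistics only

print(total)          # → 639956, matching "Total params: 639,956"
print(non_trainable)  # → 448, matching "Non-trainable params: 448"
```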
Epoch 1/200
43/43 [==============================] - 1s 15ms/step - loss: 4.3104 - accuracy: 0.0952 - val_loss: 4.2731 - val_accuracy: 0.0519
Epoch 2/200
43/43 [==============================] - 0s 11ms/step - loss: 3.9135 - accuracy: 0.1982 - val_loss: 4.3692 - val_accuracy: 0.0631
Epoch 3/200
43/43 [==============================] - 0s 12ms/step - loss: 3.7270 - accuracy: 0.2389 - val_loss: 4.4933 - val_accuracy: 0.0731
Epoch 4/200
43/43 [==============================] - 0s 12ms/step - loss: 3.5656 - accuracy: 0.2783 - val_loss: 4.5783 - val_accuracy: 0.0681
Epoch 5/200
43/43 [==============================] - 1s 12ms/step - loss: 3.4276 - accuracy: 0.3059 - val_loss: 4.7071 - val_accuracy: 0.0619
Epoch 6/200
43/43 [==============================] - 1s 12ms/step - loss: 3.3020 - accuracy: 0.3267 - val_loss: 4.5471 - val_accuracy: 0.0819
Epoch 7/200
43/43 [==============================] - 1s 12ms/step - loss: 3.2162 - accuracy: 0.3431 - val_loss: 4.5421 - val_accuracy: 0.0894
Epoch 8/200
43/43 [==============================] - 1s 12ms/step - loss: 3.1026 - accuracy: 0.3633 - val_loss: 4.3365 - val_accuracy: 0.1138
Epoch 9/200
43/43 [==============================] - 1s 12ms/step - loss: 3.0142 - accuracy: 0.3750 - val_loss: 4.1390 - val_accuracy: 0.1356
Epoch 10/200
43/43 [==============================] - 1s 12ms/step - loss: 2.9409 - accuracy: 0.3916 - val_loss: 3.8770 - val_accuracy: 0.1462
Epoch 11/200
43/43 [==============================] - 1s 12ms/step - loss: 2.8199 - accuracy: 0.4136 - val_loss: 3.5734 - val_accuracy: 0.1994
Epoch 12/200
43/43 [==============================] - 1s 12ms/step - loss: 2.7651 - accuracy: 0.4255 - val_loss: 3.3890 - val_accuracy: 0.2438
Epoch 13/200
43/43 [==============================] - 1s 12ms/step - loss: 2.6857 - accuracy: 0.4328 - val_loss: 3.1751 - val_accuracy: 0.3069
Epoch 14/200
43/43 [==============================] - 1s 12ms/step - loss: 2.5977 - accuracy: 0.4542 - val_loss: 3.0928 - val_accuracy: 0.3150
Epoch 15/200
43/43 [==============================] - 1s 12ms/step - loss: 2.5843 - accuracy: 0.4498 - val_loss: 2.9185 - val_accuracy: 0.3606
Epoch 16/200
43/43 [==============================] - 1s 12ms/step - loss: 2.5066 - accuracy: 0.4652 - val_loss: 2.8069 - val_accuracy: 0.3906
Epoch 17/200
43/43 [==============================] - 0s 12ms/step - loss: 2.4549 - accuracy: 0.4784 - val_loss: 2.8386 - val_accuracy: 0.3725
Epoch 18/200
43/43 [==============================] - 1s 12ms/step - loss: 2.3824 - accuracy: 0.4946 - val_loss: 2.8740 - val_accuracy: 0.3556
Epoch 19/200
43/43 [==============================] - 0s 11ms/step - loss: 2.3448 - accuracy: 0.4966 - val_loss: 2.7626 - val_accuracy: 0.3950
Epoch 20/200
43/43 [==============================] - 0s 12ms/step - loss: 2.3251 - accuracy: 0.4912 - val_loss: 2.4860 - val_accuracy: 0.4525
Epoch 21/200
43/43 [==============================] - 1s 12ms/step - loss: 2.2793 - accuracy: 0.5040 - val_loss: 2.5492 - val_accuracy: 0.4306
Epoch 22/200
43/43 [==============================] - 1s 12ms/step - loss: 2.1960 - accuracy: 0.5246 - val_loss: 2.5928 - val_accuracy: 0.4206
Epoch 23/200
43/43 [==============================] - 0s 12ms/step - loss: 2.1841 - accuracy: 0.5192 - val_loss: 2.4185 - val_accuracy: 0.4594
Epoch 24/200
43/43 [==============================] - 1s 12ms/step - loss: 2.1309 - accuracy: 0.5295 - val_loss: 2.3682 - val_accuracy: 0.4681
Epoch 25/200
43/43 [==============================] - 1s 12ms/step - loss: 2.0890 - accuracy: 0.5492 - val_loss: 2.1894 - val_accuracy: 0.5113
Epoch 26/200
43/43 [==============================] - 1s 12ms/step - loss: 2.0813 - accuracy: 0.5354 - val_loss: 2.4492 - val_accuracy: 0.4469
Epoch 27/200
43/43 [==============================] - 1s 12ms/step - loss: 2.0486 - accuracy: 0.5444 - val_loss: 2.5150 - val_accuracy: 0.4181
Epoch 28/200
43/43 [==============================] - 1s 12ms/step - loss: 1.9911 - accuracy: 0.5594 - val_loss: 2.4360 - val_accuracy: 0.4338
Epoch 29/200
43/43 [==============================] - 0s 12ms/step - loss: 1.9489 - accuracy: 0.5661 - val_loss: 2.2314 - val_accuracy: 0.4969
Epoch 30/200
43/43 [==============================] - 1s 12ms/step - loss: 1.9309 - accuracy: 0.5636 - val_loss: 2.1881 - val_accuracy: 0.5044
Epoch 31/200
43/43 [==============================] - 1s 12ms/step - loss: 1.9300 - accuracy: 0.5721 - val_loss: 2.3185 - val_accuracy: 0.4519
Epoch 32/200
43/43 [==============================] - 1s 12ms/step - loss: 1.8777 - accuracy: 0.5801 - val_loss: 2.4745 - val_accuracy: 0.4331
Epoch 33/200
43/43 [==============================] - 0s 12ms/step - loss: 1.8741 - accuracy: 0.5702 - val_loss: 2.2959 - val_accuracy: 0.4531
Epoch 34/200
43/43 [==============================] - 1s 12ms/step - loss: 1.8288 - accuracy: 0.5805 - val_loss: 2.1299 - val_accuracy: 0.5181
Epoch 35/200
43/43 [==============================] - 1s 12ms/step - loss: 1.8105 - accuracy: 0.5921 - val_loss: 2.0896 - val_accuracy: 0.5150
Epoch 36/200
43/43 [==============================] - 1s 12ms/step - loss: 1.7954 - accuracy: 0.5892 - val_loss: 2.2726 - val_accuracy: 0.4663
Epoch 37/200
43/43 [==============================] - 0s 12ms/step - loss: 1.7596 - accuracy: 0.5932 - val_loss: 2.4163 - val_accuracy: 0.4456
Epoch 38/200
43/43 [==============================] - 0s 12ms/step - loss: 1.7296 - accuracy: 0.6025 - val_loss: 2.1041 - val_accuracy: 0.5169
Epoch 39/200
43/43 [==============================] - 0s 12ms/step - loss: 1.7456 - accuracy: 0.6059 - val_loss: 2.1121 - val_accuracy: 0.5031
Epoch 40/200
43/43 [==============================] - 1s 12ms/step - loss: 1.6879 - accuracy: 0.6116 - val_loss: 2.0387 - val_accuracy: 0.5263
Epoch 41/200
43/43 [==============================] - 1s 12ms/step - loss: 1.6571 - accuracy: 0.6078 - val_loss: 2.0934 - val_accuracy: 0.5031
Epoch 42/200
43/43 [==============================] - 1s 12ms/step - loss: 1.6339 - accuracy: 0.6223 - val_loss: 1.9758 - val_accuracy: 0.5288
Epoch 43/200
43/43 [==============================] - 0s 12ms/step - loss: 1.6266 - accuracy: 0.6249 - val_loss: 2.0443 - val_accuracy: 0.5244
Epoch 44/200
43/43 [==============================] - 1s 12ms/step - loss: 1.5930 - accuracy: 0.6321 - val_loss: 1.9353 - val_accuracy: 0.5331
Epoch 45/200
43/43 [==============================] - 1s 12ms/step - loss: 1.5744 - accuracy: 0.6214 - val_loss: 2.0011 - val_accuracy: 0.5250
Epoch 46/200
43/43 [==============================] - 1s 12ms/step - loss: 1.5439 - accuracy: 0.6349 - val_loss: 1.8766 - val_accuracy: 0.5544
Epoch 47/200
43/43 [==============================] - 0s 12ms/step - loss: 1.5448 - accuracy: 0.6375 - val_loss: 1.8844 - val_accuracy: 0.5487
Epoch 48/200
43/43 [==============================] - 1s 12ms/step - loss: 1.5391 - accuracy: 0.6258 - val_loss: 1.9439 - val_accuracy: 0.5362
Epoch 49/200
43/43 [==============================] - 1s 12ms/step - loss: 1.4932 - accuracy: 0.6386 - val_loss: 2.0003 - val_accuracy: 0.5219
Epoch 50/200
43/43 [==============================] - 1s 12ms/step - loss: 1.4987 - accuracy: 0.6397 - val_loss: 1.9219 - val_accuracy: 0.5394
Epoch 51/200
43/43 [==============================] - 0s 12ms/step - loss: 1.4539 - accuracy: 0.6591 - val_loss: 1.7709 - val_accuracy: 0.5806
Epoch 52/200
43/43 [==============================] - 1s 12ms/step - loss: 1.4453 - accuracy: 0.6627 - val_loss: 1.9603 - val_accuracy: 0.5200
Epoch 53/200
43/43 [==============================] - 1s 12ms/step - loss: 1.4325 - accuracy: 0.6614 - val_loss: 1.7303 - val_accuracy: 0.5781
Epoch 54/200
43/43 [==============================] - 1s 12ms/step - loss: 1.4121 - accuracy: 0.6594 - val_loss: 1.9399 - val_accuracy: 0.5369
Epoch 55/200
43/43 [==============================] - 0s 12ms/step - loss: 1.4053 - accuracy: 0.6692 - val_loss: 1.7054 - val_accuracy: 0.5881
Epoch 56/200
43/43 [==============================] - 1s 12ms/step - loss: 1.3958 - accuracy: 0.6592 - val_loss: 1.6835 - val_accuracy: 0.6006
Epoch 57/200
43/43 [==============================] - 0s 12ms/step - loss: 1.3904 - accuracy: 0.6657 - val_loss: 1.7026 - val_accuracy: 0.5825
Epoch 58/200
43/43 [==============================] - 1s 12ms/step - loss: 1.3848 - accuracy: 0.6664 - val_loss: 1.7115 - val_accuracy: 0.5800
Epoch 59/200
43/43 [==============================] - 0s 12ms/step - loss: 1.3513 - accuracy: 0.6720 - val_loss: 1.7224 - val_accuracy: 0.5819
Epoch 60/200
43/43 [==============================] - 1s 12ms/step - loss: 1.3552 - accuracy: 0.6739 - val_loss: 1.6971 - val_accuracy: 0.5850
Epoch 61/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2969 - accuracy: 0.6878 - val_loss: 1.7713 - val_accuracy: 0.5750
Epoch 62/200
43/43 [==============================] - 1s 12ms/step - loss: 1.3378 - accuracy: 0.6679 - val_loss: 1.7691 - val_accuracy: 0.5713
Epoch 63/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2927 - accuracy: 0.6769 - val_loss: 1.7241 - val_accuracy: 0.5731
Epoch 64/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2455 - accuracy: 0.6987 - val_loss: 1.7341 - val_accuracy: 0.5813
Epoch 65/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2971 - accuracy: 0.6821 - val_loss: 1.6312 - val_accuracy: 0.6081
Epoch 66/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2597 - accuracy: 0.6877 - val_loss: 1.6720 - val_accuracy: 0.5906
Epoch 67/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2539 - accuracy: 0.6986 - val_loss: 1.7389 - val_accuracy: 0.5781
Epoch 68/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2427 - accuracy: 0.6980 - val_loss: 1.5956 - val_accuracy: 0.6081
Epoch 69/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2132 - accuracy: 0.6942 - val_loss: 1.6362 - val_accuracy: 0.6081
Epoch 70/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2123 - accuracy: 0.7008 - val_loss: 1.5788 - val_accuracy: 0.6037
Epoch 71/200
43/43 [==============================] - 1s 12ms/step - loss: 1.2162 - accuracy: 0.7026 - val_loss: 1.6584 - val_accuracy: 0.5925
Epoch 72/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1803 - accuracy: 0.7109 - val_loss: 1.5415 - val_accuracy: 0.6100
Epoch 73/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1650 - accuracy: 0.7159 - val_loss: 1.6265 - val_accuracy: 0.5975
Epoch 74/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1783 - accuracy: 0.7132 - val_loss: 1.4971 - val_accuracy: 0.6237
Epoch 75/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1786 - accuracy: 0.7016 - val_loss: 1.6165 - val_accuracy: 0.6031
Epoch 76/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1420 - accuracy: 0.7213 - val_loss: 1.5485 - val_accuracy: 0.6187
Epoch 77/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1330 - accuracy: 0.7159 - val_loss: 1.5662 - val_accuracy: 0.6087
Epoch 78/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1238 - accuracy: 0.7214 - val_loss: 1.5727 - val_accuracy: 0.6244
Epoch 79/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1318 - accuracy: 0.7145 - val_loss: 1.4982 - val_accuracy: 0.6344
Epoch 80/200
43/43 [==============================] - 0s 12ms/step - loss: 1.0968 - accuracy: 0.7244 - val_loss: 1.7520 - val_accuracy: 0.5744
Epoch 81/200
43/43 [==============================] - 1s 12ms/step - loss: 1.1070 - accuracy: 0.7185 - val_loss: 1.6277 - val_accuracy: 0.5931
Epoch 82/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0916 - accuracy: 0.7271 - val_loss: 1.4622 - val_accuracy: 0.6413
Epoch 83/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0643 - accuracy: 0.7327 - val_loss: 1.4679 - val_accuracy: 0.6431
Epoch 84/200
43/43 [==============================] - 0s 12ms/step - loss: 1.0715 - accuracy: 0.7311 - val_loss: 1.4788 - val_accuracy: 0.6313
Epoch 85/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0718 - accuracy: 0.7292 - val_loss: 1.4864 - val_accuracy: 0.6275
Epoch 86/200
43/43 [==============================] - 0s 12ms/step - loss: 1.0208 - accuracy: 0.7437 - val_loss: 1.5454 - val_accuracy: 0.6263
Epoch 87/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0551 - accuracy: 0.7341 - val_loss: 1.4575 - val_accuracy: 0.6406
Epoch 88/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0400 - accuracy: 0.7332 - val_loss: 1.4073 - val_accuracy: 0.6419
Epoch 89/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0174 - accuracy: 0.7471 - val_loss: 1.4936 - val_accuracy: 0.6225
Epoch 90/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0257 - accuracy: 0.7384 - val_loss: 1.5014 - val_accuracy: 0.6275
Epoch 91/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9989 - accuracy: 0.7516 - val_loss: 1.4528 - val_accuracy: 0.6456
Epoch 92/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0017 - accuracy: 0.7522 - val_loss: 1.4761 - val_accuracy: 0.6400
Epoch 93/200
43/43 [==============================] - 1s 12ms/step - loss: 1.0015 - accuracy: 0.7447 - val_loss: 1.4884 - val_accuracy: 0.6181
Epoch 94/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9950 - accuracy: 0.7428 - val_loss: 1.3980 - val_accuracy: 0.6481
Epoch 95/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9897 - accuracy: 0.7517 - val_loss: 1.4169 - val_accuracy: 0.6519
Epoch 96/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9721 - accuracy: 0.7533 - val_loss: 1.5583 - val_accuracy: 0.6062
Epoch 97/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9618 - accuracy: 0.7538 - val_loss: 1.4057 - val_accuracy: 0.6438
Epoch 98/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9440 - accuracy: 0.7669 - val_loss: 1.5023 - val_accuracy: 0.6263
Epoch 99/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9689 - accuracy: 0.7433 - val_loss: 1.5373 - val_accuracy: 0.6250
Epoch 100/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9550 - accuracy: 0.7527 - val_loss: 1.4713 - val_accuracy: 0.6119
Epoch 101/200
43/43 [==============================] - 0s 12ms/step - loss: 0.9578 - accuracy: 0.7554 - val_loss: 1.3138 - val_accuracy: 0.6650
Epoch 102/200
43/43 [==============================] - 0s 11ms/step - loss: 0.9032 - accuracy: 0.7778 - val_loss: 1.3863 - val_accuracy: 0.6481
Epoch 103/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8977 - accuracy: 0.7656 - val_loss: 1.3827 - val_accuracy: 0.6500
Epoch 104/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8910 - accuracy: 0.7741 - val_loss: 1.2814 - val_accuracy: 0.6844
Epoch 105/200
43/43 [==============================] - 1s 12ms/step - loss: 0.9110 - accuracy: 0.7681 - val_loss: 1.3826 - val_accuracy: 0.6612
Epoch 106/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8975 - accuracy: 0.7690 - val_loss: 1.4234 - val_accuracy: 0.6413
Epoch 107/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8993 - accuracy: 0.7674 - val_loss: 1.3418 - val_accuracy: 0.6612
Epoch 108/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8882 - accuracy: 0.7749 - val_loss: 1.3047 - val_accuracy: 0.6531
Epoch 109/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8696 - accuracy: 0.7762 - val_loss: 1.4056 - val_accuracy: 0.6581
Epoch 110/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8658 - accuracy: 0.7877 - val_loss: 1.3301 - val_accuracy: 0.6687
Epoch 111/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8488 - accuracy: 0.7846 - val_loss: 1.4527 - val_accuracy: 0.6250
Epoch 112/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8564 - accuracy: 0.7849 - val_loss: 1.4185 - val_accuracy: 0.6456
Epoch 113/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8417 - accuracy: 0.7880 - val_loss: 1.3235 - val_accuracy: 0.6662
Epoch 114/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8642 - accuracy: 0.7846 - val_loss: 1.2682 - val_accuracy: 0.6694
Epoch 115/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8420 - accuracy: 0.7843 - val_loss: 1.2746 - val_accuracy: 0.6769
Epoch 116/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8346 - accuracy: 0.7799 - val_loss: 1.4088 - val_accuracy: 0.6562
Epoch 117/200
43/43 [==============================] - 0s 12ms/step - loss: 0.8281 - accuracy: 0.7884 - val_loss: 1.2726 - val_accuracy: 0.6744
Epoch 118/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8281 - accuracy: 0.7864 - val_loss: 1.3672 - val_accuracy: 0.6612
Epoch 119/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8460 - accuracy: 0.7805 - val_loss: 1.3520 - val_accuracy: 0.6481
Epoch 120/200
43/43 [==============================] - 1s 13ms/step - loss: 0.8187 - accuracy: 0.7839 - val_loss: 1.3069 - val_accuracy: 0.6619
Epoch 121/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7956 - accuracy: 0.7954 - val_loss: 1.4244 - val_accuracy: 0.6456
Epoch 122/200
43/43 [==============================] - 1s 12ms/step - loss: 0.8028 - accuracy: 0.7933 - val_loss: 1.4127 - val_accuracy: 0.6506
Epoch 123/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7877 - accuracy: 0.7899 - val_loss: 1.3869 - val_accuracy: 0.6488
Epoch 124/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7976 - accuracy: 0.7973 - val_loss: 1.4273 - val_accuracy: 0.6381
Epoch 125/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7741 - accuracy: 0.7980 - val_loss: 1.3748 - val_accuracy: 0.6687
Epoch 126/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7496 - accuracy: 0.8031 - val_loss: 1.2409 - val_accuracy: 0.6925
Epoch 127/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7829 - accuracy: 0.7997 - val_loss: 1.3208 - val_accuracy: 0.6644
Epoch 128/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7753 - accuracy: 0.8024 - val_loss: 1.2862 - val_accuracy: 0.6719
Epoch 129/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7549 - accuracy: 0.8081 - val_loss: 1.3119 - val_accuracy: 0.6637
Epoch 130/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7660 - accuracy: 0.7968 - val_loss: 1.3446 - val_accuracy: 0.6581
Epoch 131/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7371 - accuracy: 0.8136 - val_loss: 1.2341 - val_accuracy: 0.6850
Epoch 132/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7558 - accuracy: 0.8061 - val_loss: 1.2587 - val_accuracy: 0.6787
Epoch 133/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7512 - accuracy: 0.8059 - val_loss: 1.2021 - val_accuracy: 0.6969
Epoch 134/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7510 - accuracy: 0.8095 - val_loss: 1.3268 - val_accuracy: 0.6637
Epoch 135/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7359 - accuracy: 0.8110 - val_loss: 1.2484 - val_accuracy: 0.6806
Epoch 136/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7398 - accuracy: 0.8099 - val_loss: 1.2708 - val_accuracy: 0.6762
Epoch 137/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7146 - accuracy: 0.8180 - val_loss: 1.3183 - val_accuracy: 0.6625
Epoch 138/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7209 - accuracy: 0.8116 - val_loss: 1.3793 - val_accuracy: 0.6556
Epoch 139/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7408 - accuracy: 0.8048 - val_loss: 1.3529 - val_accuracy: 0.6513
Epoch 140/200
43/43 [==============================] - 1s 12ms/step - loss: 0.7104 - accuracy: 0.8140 - val_loss: 1.3553 - val_accuracy: 0.6612
Epoch 141/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6878 - accuracy: 0.8263 - val_loss: 1.2038 - val_accuracy: 0.6963
Epoch 142/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6937 - accuracy: 0.8237 - val_loss: 1.4136 - val_accuracy: 0.6494
Epoch 143/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6915 - accuracy: 0.8261 - val_loss: 1.2861 - val_accuracy: 0.6806
Epoch 144/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6914 - accuracy: 0.8256 - val_loss: 1.2849 - val_accuracy: 0.6781
Epoch 145/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6887 - accuracy: 0.8219 - val_loss: 1.2225 - val_accuracy: 0.6894
Epoch 146/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6709 - accuracy: 0.8283 - val_loss: 1.3365 - val_accuracy: 0.6719
Epoch 147/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6573 - accuracy: 0.8340 - val_loss: 1.1604 - val_accuracy: 0.7050
Epoch 148/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6916 - accuracy: 0.8190 - val_loss: 1.2378 - val_accuracy: 0.6981
Epoch 149/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6600 - accuracy: 0.8287 - val_loss: 1.2997 - val_accuracy: 0.6806
Epoch 150/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6754 - accuracy: 0.8212 - val_loss: 1.2944 - val_accuracy: 0.6862
Epoch 151/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6607 - accuracy: 0.8239 - val_loss: 1.3248 - val_accuracy: 0.6700
Epoch 152/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6565 - accuracy: 0.8335 - val_loss: 1.3550 - val_accuracy: 0.6694
Epoch 153/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6534 - accuracy: 0.8332 - val_loss: 1.3389 - val_accuracy: 0.6769
Epoch 154/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6780 - accuracy: 0.8185 - val_loss: 1.3228 - val_accuracy: 0.6737
Epoch 155/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6462 - accuracy: 0.8355 - val_loss: 1.2657 - val_accuracy: 0.6881
Epoch 156/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6398 - accuracy: 0.8373 - val_loss: 1.2544 - val_accuracy: 0.6737
Epoch 157/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6469 - accuracy: 0.8332 - val_loss: 1.2354 - val_accuracy: 0.6844
Epoch 158/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6168 - accuracy: 0.8459 - val_loss: 1.1784 - val_accuracy: 0.7000
Epoch 159/200
43/43 [==============================] - 1s 13ms/step - loss: 0.6447 - accuracy: 0.8359 - val_loss: 1.1987 - val_accuracy: 0.6894
Epoch 160/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6295 - accuracy: 0.8340 - val_loss: 1.3900 - val_accuracy: 0.6562
Epoch 161/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6162 - accuracy: 0.8405 - val_loss: 1.2337 - val_accuracy: 0.6881
Epoch 162/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6161 - accuracy: 0.8385 - val_loss: 1.2325 - val_accuracy: 0.6888
Epoch 163/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6049 - accuracy: 0.8470 - val_loss: 1.2601 - val_accuracy: 0.6787
Epoch 164/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6205 - accuracy: 0.8411 - val_loss: 1.1590 - val_accuracy: 0.7013
Epoch 165/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6020 - accuracy: 0.8478 - val_loss: 1.2082 - val_accuracy: 0.6850
Epoch 166/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6248 - accuracy: 0.8392 - val_loss: 1.2758 - val_accuracy: 0.6800
Epoch 167/200
43/43 [==============================] - 1s 12ms/step - loss: 0.6041 - accuracy: 0.8480 - val_loss: 1.1789 - val_accuracy: 0.7138
Epoch 168/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5866 - accuracy: 0.8500 - val_loss: 1.2827 - val_accuracy: 0.6913
Epoch 169/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5859 - accuracy: 0.8538 - val_loss: 1.3328 - val_accuracy: 0.6587
Epoch 170/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5683 - accuracy: 0.8582 - val_loss: 1.2766 - val_accuracy: 0.6800
Epoch 171/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5658 - accuracy: 0.8586 - val_loss: 1.2180 - val_accuracy: 0.6956
Epoch 172/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5926 - accuracy: 0.8446 - val_loss: 1.2765 - val_accuracy: 0.6806
Epoch 173/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5876 - accuracy: 0.8463 - val_loss: 1.1896 - val_accuracy: 0.7013
Epoch 174/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5726 - accuracy: 0.8579 - val_loss: 1.2362 - val_accuracy: 0.6956
Epoch 175/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5623 - accuracy: 0.8567 - val_loss: 1.1910 - val_accuracy: 0.7106
Epoch 176/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5679 - accuracy: 0.8536 - val_loss: 1.3192 - val_accuracy: 0.6850
Epoch 177/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5528 - accuracy: 0.8638 - val_loss: 1.2872 - val_accuracy: 0.6806
Epoch 178/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5598 - accuracy: 0.8550 - val_loss: 1.2247 - val_accuracy: 0.6950
Epoch 179/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5512 - accuracy: 0.8631 - val_loss: 1.2245 - val_accuracy: 0.6913
Epoch 180/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5320 - accuracy: 0.8686 - val_loss: 1.2000 - val_accuracy: 0.6975
Epoch 181/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5568 - accuracy: 0.8540 - val_loss: 1.1215 - val_accuracy: 0.7225
Epoch 182/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5482 - accuracy: 0.8586 - val_loss: 1.2654 - val_accuracy: 0.6719
Epoch 183/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5316 - accuracy: 0.8656 - val_loss: 1.1833 - val_accuracy: 0.6900
Epoch 184/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5390 - accuracy: 0.8677 - val_loss: 1.1807 - val_accuracy: 0.7063
Epoch 185/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5308 - accuracy: 0.8662 - val_loss: 1.1789 - val_accuracy: 0.7063
Epoch 186/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5437 - accuracy: 0.8631 - val_loss: 1.2185 - val_accuracy: 0.6944
Epoch 187/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5440 - accuracy: 0.8599 - val_loss: 1.1238 - val_accuracy: 0.7106
Epoch 188/200
43/43 [==============================] - 1s 13ms/step - loss: 0.5407 - accuracy: 0.8624 - val_loss: 1.1722 - val_accuracy: 0.7050
Epoch 189/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5419 - accuracy: 0.8639 - val_loss: 1.2246 - val_accuracy: 0.6944
Epoch 190/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5270 - accuracy: 0.8673 - val_loss: 1.1306 - val_accuracy: 0.7088
Epoch 191/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5152 - accuracy: 0.8729 - val_loss: 1.2330 - val_accuracy: 0.7056
Epoch 192/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5266 - accuracy: 0.8640 - val_loss: 1.2158 - val_accuracy: 0.7038
Epoch 193/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5417 - accuracy: 0.8664 - val_loss: 1.1570 - val_accuracy: 0.7131
Epoch 194/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5170 - accuracy: 0.8732 - val_loss: 1.2854 - val_accuracy: 0.6913
Epoch 195/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5316 - accuracy: 0.8658 - val_loss: 1.1912 - val_accuracy: 0.7075
Epoch 196/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5278 - accuracy: 0.8657 - val_loss: 1.1900 - val_accuracy: 0.7025
Epoch 197/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5114 - accuracy: 0.8764 - val_loss: 1.2670 - val_accuracy: 0.6819
Epoch 198/200
43/43 [==============================] - 1s 13ms/step - loss: 0.5144 - accuracy: 0.8703 - val_loss: 1.1708 - val_accuracy: 0.7138
Epoch 199/200
43/43 [==============================] - 1s 12ms/step - loss: 0.5036 - accuracy: 0.8771 - val_loss: 1.1901 - val_accuracy: 0.7031
Epoch 200/200
43/43 [==============================] - 1s 13ms/step - loss: 0.4801 - accuracy: 0.8825 - val_loss: 1.1698 - val_accuracy: 0.7088
In [ ]:
_, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
accuracies_opt_200["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.208
Accuracy: 71.050%
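For reference, the report layout printed above can be reproduced with a small formatting helper (a sketch only: `model_report` itself is defined earlier in the notebook and also plots the training curves; `format_report` here is a hypothetical name):

```python
def format_report(loss, accuracy):
    # Mimic the text block printed by model_report (hypothetical helper):
    # a header, an underline of matching length, then loss and accuracy.
    header = "Test set evaluation metrics"
    return "\n".join([
        header,
        "-" * len(header),
        f"Loss:     {loss:.3f}",
        f"Accuracy: {accuracy:.3%}",
    ])

print(format_report(1.208, 0.7105))
```

With the CNN1 values above this prints `Loss:     1.208` and `Accuracy: 71.050%`, matching the cell output.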
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_14"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_26 (Conv2D)           (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_26 (Batc (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_26 (ReLU)              (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_18 (MaxPooling (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_37 (Dropout)         (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_27 (Conv2D)           (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_27 (Batc (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_27 (ReLU)              (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_19 (MaxPooling (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_38 (Dropout)         (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_28 (Conv2D)           (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_28 (Batc (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_28 (ReLU)              (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_20 (MaxPooling (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_39 (Dropout)         (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_29 (Conv2D)           (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_29 (Batc (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_29 (ReLU)              (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_40 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_8 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_22 (Dense)             (None, 512)               2097664   
_________________________________________________________________
dropout_41 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_23 (Dense)             (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
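As a sanity check, the `Param #` column of the summary above can be re-derived from the layer shapes alone. The formulas below assume the standard Keras parameterization of `Conv2D`, `BatchNormalization`, and `Dense`; the 3×3 kernel size is not shown in the summary but follows from the counts (e.g. 896 = (3·3·3 + 1)·32 for the first layer). The helper names are our own, not Keras API:

```python
def conv2d_params(k, c_in, c_out):
    # k*k kernel weights per input channel, plus one bias per filter
    return (k * k * c_in + 1) * c_out

def batchnorm_params(c):
    # gamma and beta (trainable) + moving mean and variance (non-trainable)
    return 4 * c

def dense_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return (n_in + 1) * n_out

# Reproduce the summary's parameter counts for CNN2 (3x3 kernels throughout)
params = [
    conv2d_params(3, 3, 32),         # conv2d_26:  896
    batchnorm_params(32),            # batch_normalization_26: 128
    conv2d_params(3, 32, 64),        # conv2d_27:  18496
    batchnorm_params(64),            # batch_normalization_27: 256
    conv2d_params(3, 64, 128),       # conv2d_28:  73856
    batchnorm_params(128),           # batch_normalization_28: 512
    conv2d_params(3, 128, 256),      # conv2d_29:  295168
    batchnorm_params(256),           # batch_normalization_29: 1024
    dense_params(4 * 4 * 256, 512),  # dense_22:   2097664
    dense_params(512, 20),           # dense_23:   10260
]
print(sum(params))                    # total params: 2498260
# Non-trainable params are the BatchNorm moving statistics, 2 per channel:
print(2 * (32 + 64 + 128 + 256))      # 960
```

The totals match the summary: 2,498,260 parameters, of which 960 (the BatchNorm moving means and variances) are non-trainable.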
Epoch 1/200
43/43 [==============================] - 2s 19ms/step - loss: 6.1584 - accuracy: 0.0974 - val_loss: 5.9495 - val_accuracy: 0.0494
Epoch 2/200
43/43 [==============================] - 1s 14ms/step - loss: 5.5601 - accuracy: 0.1917 - val_loss: 6.0232 - val_accuracy: 0.0463
Epoch 3/200
43/43 [==============================] - 1s 14ms/step - loss: 5.3560 - accuracy: 0.2329 - val_loss: 6.1355 - val_accuracy: 0.0506
Epoch 4/200
43/43 [==============================] - 1s 14ms/step - loss: 5.1795 - accuracy: 0.2562 - val_loss: 6.2714 - val_accuracy: 0.0538
Epoch 5/200
43/43 [==============================] - 1s 14ms/step - loss: 4.9865 - accuracy: 0.2873 - val_loss: 6.3536 - val_accuracy: 0.0500
Epoch 6/200
43/43 [==============================] - 1s 14ms/step - loss: 4.8371 - accuracy: 0.3067 - val_loss: 6.2498 - val_accuracy: 0.0650
Epoch 7/200
43/43 [==============================] - 1s 15ms/step - loss: 4.6838 - accuracy: 0.3298 - val_loss: 6.3374 - val_accuracy: 0.0556
Epoch 8/200
43/43 [==============================] - 1s 15ms/step - loss: 4.5313 - accuracy: 0.3511 - val_loss: 6.1129 - val_accuracy: 0.0700
Epoch 9/200
43/43 [==============================] - 1s 14ms/step - loss: 4.4079 - accuracy: 0.3649 - val_loss: 5.8424 - val_accuracy: 0.1069
Epoch 10/200
43/43 [==============================] - 1s 14ms/step - loss: 4.2149 - accuracy: 0.4102 - val_loss: 5.5624 - val_accuracy: 0.1175
Epoch 11/200
43/43 [==============================] - 1s 15ms/step - loss: 4.1023 - accuracy: 0.4063 - val_loss: 5.0537 - val_accuracy: 0.1787
Epoch 12/200
43/43 [==============================] - 1s 14ms/step - loss: 3.9854 - accuracy: 0.4254 - val_loss: 4.7087 - val_accuracy: 0.2200
Epoch 13/200
43/43 [==============================] - 1s 15ms/step - loss: 3.8408 - accuracy: 0.4546 - val_loss: 4.4668 - val_accuracy: 0.2625
Epoch 14/200
43/43 [==============================] - 1s 15ms/step - loss: 3.7353 - accuracy: 0.4699 - val_loss: 4.1478 - val_accuracy: 0.3275
Epoch 15/200
43/43 [==============================] - 1s 15ms/step - loss: 3.6285 - accuracy: 0.4718 - val_loss: 3.9493 - val_accuracy: 0.3575
Epoch 16/200
43/43 [==============================] - 1s 14ms/step - loss: 3.5370 - accuracy: 0.4783 - val_loss: 3.8758 - val_accuracy: 0.3738
Epoch 17/200
43/43 [==============================] - 1s 14ms/step - loss: 3.4247 - accuracy: 0.5027 - val_loss: 3.6230 - val_accuracy: 0.4369
Epoch 18/200
43/43 [==============================] - 1s 14ms/step - loss: 3.3482 - accuracy: 0.5048 - val_loss: 3.6436 - val_accuracy: 0.4125
Epoch 19/200
43/43 [==============================] - 1s 14ms/step - loss: 3.2512 - accuracy: 0.5308 - val_loss: 3.4272 - val_accuracy: 0.4750
Epoch 20/200
43/43 [==============================] - 1s 14ms/step - loss: 3.1683 - accuracy: 0.5333 - val_loss: 3.3958 - val_accuracy: 0.4538
Epoch 21/200
43/43 [==============================] - 1s 15ms/step - loss: 3.0955 - accuracy: 0.5358 - val_loss: 3.6172 - val_accuracy: 0.4000
Epoch 22/200
43/43 [==============================] - 1s 14ms/step - loss: 2.9827 - accuracy: 0.5611 - val_loss: 3.5210 - val_accuracy: 0.4256
Epoch 23/200
43/43 [==============================] - 1s 15ms/step - loss: 2.9823 - accuracy: 0.5406 - val_loss: 3.5305 - val_accuracy: 0.4150
Epoch 24/200
43/43 [==============================] - 1s 15ms/step - loss: 2.8494 - accuracy: 0.5712 - val_loss: 3.3306 - val_accuracy: 0.4394
Epoch 25/200
43/43 [==============================] - 1s 14ms/step - loss: 2.7926 - accuracy: 0.5688 - val_loss: 3.2228 - val_accuracy: 0.4625
Epoch 26/200
43/43 [==============================] - 1s 14ms/step - loss: 2.7285 - accuracy: 0.5821 - val_loss: 3.4378 - val_accuracy: 0.4119
Epoch 27/200
43/43 [==============================] - 1s 14ms/step - loss: 2.6364 - accuracy: 0.5994 - val_loss: 3.1713 - val_accuracy: 0.4650
Epoch 28/200
43/43 [==============================] - 1s 14ms/step - loss: 2.6220 - accuracy: 0.5880 - val_loss: 3.0350 - val_accuracy: 0.4863
Epoch 29/200
43/43 [==============================] - 1s 15ms/step - loss: 2.5369 - accuracy: 0.6045 - val_loss: 3.0329 - val_accuracy: 0.4731
Epoch 30/200
43/43 [==============================] - 1s 14ms/step - loss: 2.5022 - accuracy: 0.6008 - val_loss: 3.0527 - val_accuracy: 0.4681
Epoch 31/200
43/43 [==============================] - 1s 14ms/step - loss: 2.4556 - accuracy: 0.6130 - val_loss: 2.8822 - val_accuracy: 0.4956
Epoch 32/200
43/43 [==============================] - 1s 15ms/step - loss: 2.3784 - accuracy: 0.6217 - val_loss: 3.0970 - val_accuracy: 0.4506
Epoch 33/200
43/43 [==============================] - 1s 15ms/step - loss: 2.3203 - accuracy: 0.6290 - val_loss: 3.0371 - val_accuracy: 0.4581
Epoch 34/200
43/43 [==============================] - 1s 15ms/step - loss: 2.2754 - accuracy: 0.6386 - val_loss: 3.0296 - val_accuracy: 0.4531
Epoch 35/200
43/43 [==============================] - 1s 15ms/step - loss: 2.2205 - accuracy: 0.6450 - val_loss: 3.0578 - val_accuracy: 0.4506
Epoch 36/200
43/43 [==============================] - 1s 14ms/step - loss: 2.1738 - accuracy: 0.6530 - val_loss: 2.7128 - val_accuracy: 0.5131
Epoch 37/200
43/43 [==============================] - 1s 15ms/step - loss: 2.1121 - accuracy: 0.6577 - val_loss: 2.6645 - val_accuracy: 0.5256
Epoch 38/200
43/43 [==============================] - 1s 14ms/step - loss: 2.0988 - accuracy: 0.6590 - val_loss: 3.0653 - val_accuracy: 0.4581
Epoch 39/200
43/43 [==============================] - 1s 14ms/step - loss: 2.0250 - accuracy: 0.6683 - val_loss: 2.6862 - val_accuracy: 0.5025
Epoch 40/200
43/43 [==============================] - 1s 14ms/step - loss: 1.9776 - accuracy: 0.6788 - val_loss: 2.4962 - val_accuracy: 0.5419
Epoch 41/200
43/43 [==============================] - 1s 15ms/step - loss: 1.9488 - accuracy: 0.6819 - val_loss: 2.6309 - val_accuracy: 0.5056
Epoch 42/200
43/43 [==============================] - 1s 15ms/step - loss: 1.8798 - accuracy: 0.6915 - val_loss: 2.6670 - val_accuracy: 0.5013
Epoch 43/200
43/43 [==============================] - 1s 14ms/step - loss: 1.8476 - accuracy: 0.7000 - val_loss: 2.5622 - val_accuracy: 0.5275
Epoch 44/200
43/43 [==============================] - 1s 15ms/step - loss: 1.8344 - accuracy: 0.6924 - val_loss: 2.8638 - val_accuracy: 0.4600
Epoch 45/200
43/43 [==============================] - 1s 15ms/step - loss: 1.7844 - accuracy: 0.7086 - val_loss: 2.5082 - val_accuracy: 0.5294
Epoch 46/200
43/43 [==============================] - 1s 15ms/step - loss: 1.7122 - accuracy: 0.7127 - val_loss: 2.4862 - val_accuracy: 0.5319
Epoch 47/200
43/43 [==============================] - 1s 15ms/step - loss: 1.7353 - accuracy: 0.7072 - val_loss: 2.4851 - val_accuracy: 0.5288
Epoch 48/200
43/43 [==============================] - 1s 15ms/step - loss: 1.6574 - accuracy: 0.7216 - val_loss: 2.5378 - val_accuracy: 0.5100
Epoch 49/200
43/43 [==============================] - 1s 15ms/step - loss: 1.6310 - accuracy: 0.7257 - val_loss: 2.3225 - val_accuracy: 0.5500
Epoch 50/200
43/43 [==============================] - 1s 15ms/step - loss: 1.5895 - accuracy: 0.7315 - val_loss: 2.5106 - val_accuracy: 0.5169
Epoch 51/200
43/43 [==============================] - 1s 14ms/step - loss: 1.5734 - accuracy: 0.7351 - val_loss: 2.5581 - val_accuracy: 0.5031
Epoch 52/200
43/43 [==============================] - 1s 15ms/step - loss: 1.5274 - accuracy: 0.7407 - val_loss: 2.3482 - val_accuracy: 0.5350
Epoch 53/200
43/43 [==============================] - 1s 15ms/step - loss: 1.5109 - accuracy: 0.7439 - val_loss: 2.3497 - val_accuracy: 0.5450
Epoch 54/200
43/43 [==============================] - 1s 14ms/step - loss: 1.4905 - accuracy: 0.7427 - val_loss: 2.2073 - val_accuracy: 0.5750
Epoch 55/200
43/43 [==============================] - 1s 15ms/step - loss: 1.4380 - accuracy: 0.7522 - val_loss: 2.3961 - val_accuracy: 0.5331
Epoch 56/200
43/43 [==============================] - 1s 14ms/step - loss: 1.4381 - accuracy: 0.7525 - val_loss: 2.2699 - val_accuracy: 0.5544
Epoch 57/200
43/43 [==============================] - 1s 15ms/step - loss: 1.3909 - accuracy: 0.7570 - val_loss: 2.0448 - val_accuracy: 0.5888
Epoch 58/200
43/43 [==============================] - 1s 14ms/step - loss: 1.3602 - accuracy: 0.7635 - val_loss: 2.0577 - val_accuracy: 0.5869
Epoch 59/200
43/43 [==============================] - 1s 15ms/step - loss: 1.3447 - accuracy: 0.7746 - val_loss: 2.0750 - val_accuracy: 0.5844
Epoch 60/200
43/43 [==============================] - 1s 15ms/step - loss: 1.2991 - accuracy: 0.7786 - val_loss: 2.1730 - val_accuracy: 0.5650
Epoch 61/200
43/43 [==============================] - 1s 15ms/step - loss: 1.3056 - accuracy: 0.7755 - val_loss: 2.0890 - val_accuracy: 0.5825
Epoch 62/200
43/43 [==============================] - 1s 15ms/step - loss: 1.2638 - accuracy: 0.7886 - val_loss: 1.9569 - val_accuracy: 0.6031
Epoch 63/200
43/43 [==============================] - 1s 15ms/step - loss: 1.2378 - accuracy: 0.7869 - val_loss: 2.1440 - val_accuracy: 0.5575
Epoch 64/200
43/43 [==============================] - 1s 15ms/step - loss: 1.2192 - accuracy: 0.7893 - val_loss: 1.9461 - val_accuracy: 0.6025
Epoch 65/200
43/43 [==============================] - 1s 15ms/step - loss: 1.1762 - accuracy: 0.8035 - val_loss: 1.9851 - val_accuracy: 0.5987
Epoch 66/200
43/43 [==============================] - 1s 14ms/step - loss: 1.1759 - accuracy: 0.7979 - val_loss: 1.8906 - val_accuracy: 0.6169
Epoch 67/200
43/43 [==============================] - 1s 14ms/step - loss: 1.1368 - accuracy: 0.8030 - val_loss: 1.9747 - val_accuracy: 0.6031
Epoch 68/200
43/43 [==============================] - 1s 15ms/step - loss: 1.1044 - accuracy: 0.8181 - val_loss: 2.0819 - val_accuracy: 0.5738
Epoch 69/200
43/43 [==============================] - 1s 15ms/step - loss: 1.1097 - accuracy: 0.8029 - val_loss: 1.8810 - val_accuracy: 0.6162
Epoch 70/200
43/43 [==============================] - 1s 15ms/step - loss: 1.0836 - accuracy: 0.8130 - val_loss: 1.8996 - val_accuracy: 0.6106
Epoch 71/200
43/43 [==============================] - 1s 15ms/step - loss: 1.0515 - accuracy: 0.8201 - val_loss: 1.9598 - val_accuracy: 0.6037
Epoch 72/200
43/43 [==============================] - 1s 14ms/step - loss: 1.0413 - accuracy: 0.8181 - val_loss: 2.1950 - val_accuracy: 0.5594
Epoch 73/200
43/43 [==============================] - 1s 14ms/step - loss: 1.0137 - accuracy: 0.8287 - val_loss: 1.9270 - val_accuracy: 0.6012
Epoch 74/200
43/43 [==============================] - 1s 15ms/step - loss: 0.9968 - accuracy: 0.8256 - val_loss: 1.7079 - val_accuracy: 0.6375
Epoch 75/200
43/43 [==============================] - 1s 14ms/step - loss: 0.9725 - accuracy: 0.8405 - val_loss: 1.9827 - val_accuracy: 0.5969
Epoch 76/200
43/43 [==============================] - 1s 15ms/step - loss: 0.9418 - accuracy: 0.8420 - val_loss: 1.9045 - val_accuracy: 0.5975
Epoch 77/200
43/43 [==============================] - 1s 15ms/step - loss: 0.9510 - accuracy: 0.8346 - val_loss: 1.9224 - val_accuracy: 0.6075
Epoch 78/200
43/43 [==============================] - 1s 15ms/step - loss: 0.9075 - accuracy: 0.8498 - val_loss: 1.7778 - val_accuracy: 0.6244
Epoch 79/200
43/43 [==============================] - 1s 15ms/step - loss: 0.9088 - accuracy: 0.8428 - val_loss: 1.7582 - val_accuracy: 0.6225
Epoch 80/200
43/43 [==============================] - 1s 14ms/step - loss: 0.8879 - accuracy: 0.8519 - val_loss: 1.8157 - val_accuracy: 0.6169
Epoch 81/200
43/43 [==============================] - 1s 15ms/step - loss: 0.9082 - accuracy: 0.8425 - val_loss: 1.9414 - val_accuracy: 0.5831
Epoch 82/200
43/43 [==============================] - 1s 15ms/step - loss: 0.8511 - accuracy: 0.8585 - val_loss: 1.7766 - val_accuracy: 0.6275
Epoch 83/200
43/43 [==============================] - 1s 15ms/step - loss: 0.8326 - accuracy: 0.8643 - val_loss: 1.7936 - val_accuracy: 0.6181
Epoch 84/200
43/43 [==============================] - 1s 15ms/step - loss: 0.8089 - accuracy: 0.8706 - val_loss: 1.8801 - val_accuracy: 0.6056
Epoch 85/200
43/43 [==============================] - 1s 14ms/step - loss: 0.7954 - accuracy: 0.8671 - val_loss: 1.7567 - val_accuracy: 0.6275
Epoch 86/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7923 - accuracy: 0.8723 - val_loss: 1.7851 - val_accuracy: 0.6175
Epoch 87/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7788 - accuracy: 0.8678 - val_loss: 1.7634 - val_accuracy: 0.6363
Epoch 88/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7533 - accuracy: 0.8849 - val_loss: 1.6787 - val_accuracy: 0.6381
Epoch 89/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7491 - accuracy: 0.8787 - val_loss: 1.6907 - val_accuracy: 0.6344
Epoch 90/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7355 - accuracy: 0.8843 - val_loss: 1.8236 - val_accuracy: 0.6100
Epoch 91/200
43/43 [==============================] - 1s 14ms/step - loss: 0.7191 - accuracy: 0.8877 - val_loss: 1.7042 - val_accuracy: 0.6444
Epoch 92/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7313 - accuracy: 0.8750 - val_loss: 1.7457 - val_accuracy: 0.6275
Epoch 93/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7087 - accuracy: 0.8835 - val_loss: 1.8041 - val_accuracy: 0.6212
Epoch 94/200
43/43 [==============================] - 1s 15ms/step - loss: 0.7038 - accuracy: 0.8886 - val_loss: 1.7821 - val_accuracy: 0.6313
Epoch 95/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6829 - accuracy: 0.8892 - val_loss: 1.6119 - val_accuracy: 0.6494
Epoch 96/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6783 - accuracy: 0.8915 - val_loss: 1.6425 - val_accuracy: 0.6525
Epoch 97/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6623 - accuracy: 0.8926 - val_loss: 1.7045 - val_accuracy: 0.6225
Epoch 98/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6433 - accuracy: 0.9010 - val_loss: 1.8196 - val_accuracy: 0.6162
Epoch 99/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6489 - accuracy: 0.8962 - val_loss: 1.5529 - val_accuracy: 0.6513
Epoch 100/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6218 - accuracy: 0.8993 - val_loss: 1.5939 - val_accuracy: 0.6587
Epoch 101/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6282 - accuracy: 0.8957 - val_loss: 1.7610 - val_accuracy: 0.6288
Epoch 102/200
43/43 [==============================] - 1s 15ms/step - loss: 0.6161 - accuracy: 0.8995 - val_loss: 1.6126 - val_accuracy: 0.6525
Epoch 103/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5896 - accuracy: 0.9095 - val_loss: 1.6849 - val_accuracy: 0.6419
Epoch 104/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5849 - accuracy: 0.9141 - val_loss: 1.7674 - val_accuracy: 0.6187
Epoch 105/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5953 - accuracy: 0.9065 - val_loss: 1.6778 - val_accuracy: 0.6444
Epoch 106/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5838 - accuracy: 0.9054 - val_loss: 1.8508 - val_accuracy: 0.6125
Epoch 107/200
43/43 [==============================] - 1s 14ms/step - loss: 0.5743 - accuracy: 0.9070 - val_loss: 1.7109 - val_accuracy: 0.6356
Epoch 108/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5577 - accuracy: 0.9144 - val_loss: 1.5920 - val_accuracy: 0.6519
Epoch 109/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5517 - accuracy: 0.9138 - val_loss: 1.7179 - val_accuracy: 0.6406
Epoch 110/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5430 - accuracy: 0.9146 - val_loss: 1.7336 - val_accuracy: 0.6331
Epoch 111/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5454 - accuracy: 0.9133 - val_loss: 1.7358 - val_accuracy: 0.6263
Epoch 112/200
43/43 [==============================] - 1s 14ms/step - loss: 0.5381 - accuracy: 0.9138 - val_loss: 1.6781 - val_accuracy: 0.6356
Epoch 113/200
43/43 [==============================] - 1s 14ms/step - loss: 0.5200 - accuracy: 0.9197 - val_loss: 1.7786 - val_accuracy: 0.6331
Epoch 114/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5137 - accuracy: 0.9186 - val_loss: 1.6590 - val_accuracy: 0.6325
Epoch 115/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5111 - accuracy: 0.9192 - val_loss: 1.7923 - val_accuracy: 0.6194
Epoch 116/200
43/43 [==============================] - 1s 15ms/step - loss: 0.5050 - accuracy: 0.9192 - val_loss: 1.4707 - val_accuracy: 0.6712
Epoch 117/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4905 - accuracy: 0.9266 - val_loss: 1.5914 - val_accuracy: 0.6500
Epoch 118/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4822 - accuracy: 0.9269 - val_loss: 1.5868 - val_accuracy: 0.6606
Epoch 119/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4628 - accuracy: 0.9361 - val_loss: 1.6972 - val_accuracy: 0.6394
Epoch 120/200
43/43 [==============================] - 1s 14ms/step - loss: 0.4638 - accuracy: 0.9285 - val_loss: 1.6905 - val_accuracy: 0.6425
Epoch 121/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4834 - accuracy: 0.9258 - val_loss: 1.4744 - val_accuracy: 0.6675
Epoch 122/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4546 - accuracy: 0.9290 - val_loss: 1.7308 - val_accuracy: 0.6400
Epoch 123/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4474 - accuracy: 0.9343 - val_loss: 1.5782 - val_accuracy: 0.6513
Epoch 124/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4513 - accuracy: 0.9374 - val_loss: 1.7735 - val_accuracy: 0.6306
Epoch 125/200
43/43 [==============================] - 1s 14ms/step - loss: 0.4514 - accuracy: 0.9279 - val_loss: 1.4820 - val_accuracy: 0.6694
Epoch 126/200
43/43 [==============================] - 1s 14ms/step - loss: 0.4425 - accuracy: 0.9313 - val_loss: 1.6127 - val_accuracy: 0.6506
Epoch 127/200
43/43 [==============================] - 1s 14ms/step - loss: 0.4281 - accuracy: 0.9383 - val_loss: 1.6881 - val_accuracy: 0.6513
Epoch 128/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4268 - accuracy: 0.9388 - val_loss: 1.5796 - val_accuracy: 0.6581
Epoch 129/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4255 - accuracy: 0.9394 - val_loss: 1.6480 - val_accuracy: 0.6425
Epoch 130/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4219 - accuracy: 0.9370 - val_loss: 1.4784 - val_accuracy: 0.6831
Epoch 131/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4212 - accuracy: 0.9343 - val_loss: 1.5742 - val_accuracy: 0.6562
Epoch 132/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4130 - accuracy: 0.9403 - val_loss: 1.5955 - val_accuracy: 0.6544
Epoch 133/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4019 - accuracy: 0.9427 - val_loss: 1.4886 - val_accuracy: 0.6725
Epoch 134/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4004 - accuracy: 0.9424 - val_loss: 1.4404 - val_accuracy: 0.6787
Epoch 135/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3863 - accuracy: 0.9465 - val_loss: 1.6877 - val_accuracy: 0.6369
Epoch 136/200
43/43 [==============================] - 1s 15ms/step - loss: 0.4032 - accuracy: 0.9346 - val_loss: 1.4242 - val_accuracy: 0.6794
Epoch 137/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3839 - accuracy: 0.9470 - val_loss: 1.4849 - val_accuracy: 0.6731
Epoch 138/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3900 - accuracy: 0.9420 - val_loss: 1.6140 - val_accuracy: 0.6637
Epoch 139/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3799 - accuracy: 0.9439 - val_loss: 1.7567 - val_accuracy: 0.6344
Epoch 140/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3796 - accuracy: 0.9428 - val_loss: 1.7196 - val_accuracy: 0.6388
Epoch 141/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3735 - accuracy: 0.9448 - val_loss: 1.8176 - val_accuracy: 0.6225
Epoch 142/200
43/43 [==============================] - 1s 14ms/step - loss: 0.3669 - accuracy: 0.9433 - val_loss: 1.5948 - val_accuracy: 0.6569
Epoch 143/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3798 - accuracy: 0.9435 - val_loss: 1.5609 - val_accuracy: 0.6644
Epoch 144/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3675 - accuracy: 0.9437 - val_loss: 1.5632 - val_accuracy: 0.6575
Epoch 145/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3607 - accuracy: 0.9493 - val_loss: 1.4728 - val_accuracy: 0.6819
Epoch 146/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3660 - accuracy: 0.9455 - val_loss: 1.5548 - val_accuracy: 0.6625
Epoch 147/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3532 - accuracy: 0.9484 - val_loss: 1.4733 - val_accuracy: 0.6656
Epoch 148/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3460 - accuracy: 0.9483 - val_loss: 1.4450 - val_accuracy: 0.6850
Epoch 149/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3478 - accuracy: 0.9483 - val_loss: 1.5413 - val_accuracy: 0.6606
Epoch 150/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3301 - accuracy: 0.9560 - val_loss: 1.4185 - val_accuracy: 0.6869
Epoch 151/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3336 - accuracy: 0.9529 - val_loss: 1.4926 - val_accuracy: 0.6812
Epoch 152/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3359 - accuracy: 0.9524 - val_loss: 1.3831 - val_accuracy: 0.6775
Epoch 153/200
43/43 [==============================] - 1s 14ms/step - loss: 0.3440 - accuracy: 0.9494 - val_loss: 1.5351 - val_accuracy: 0.6681
Epoch 154/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3217 - accuracy: 0.9542 - val_loss: 1.5109 - val_accuracy: 0.6837
Epoch 155/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3207 - accuracy: 0.9539 - val_loss: 1.7122 - val_accuracy: 0.6388
Epoch 156/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3273 - accuracy: 0.9545 - val_loss: 1.4596 - val_accuracy: 0.6875
Epoch 157/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3160 - accuracy: 0.9559 - val_loss: 1.5138 - val_accuracy: 0.6831
Epoch 158/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3105 - accuracy: 0.9572 - val_loss: 1.4265 - val_accuracy: 0.6881
Epoch 159/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3124 - accuracy: 0.9550 - val_loss: 1.6010 - val_accuracy: 0.6756
Epoch 160/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3201 - accuracy: 0.9523 - val_loss: 1.6411 - val_accuracy: 0.6562
Epoch 161/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3134 - accuracy: 0.9576 - val_loss: 1.4627 - val_accuracy: 0.6719
Epoch 162/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2963 - accuracy: 0.9602 - val_loss: 1.5653 - val_accuracy: 0.6706
Epoch 163/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2898 - accuracy: 0.9603 - val_loss: 1.5164 - val_accuracy: 0.6787
Epoch 164/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3016 - accuracy: 0.9556 - val_loss: 1.4627 - val_accuracy: 0.6831
Epoch 165/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2935 - accuracy: 0.9609 - val_loss: 1.4647 - val_accuracy: 0.6850
Epoch 166/200
43/43 [==============================] - 1s 15ms/step - loss: 0.3007 - accuracy: 0.9571 - val_loss: 1.4483 - val_accuracy: 0.6812
Epoch 167/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2917 - accuracy: 0.9596 - val_loss: 1.6354 - val_accuracy: 0.6637
Epoch 168/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2864 - accuracy: 0.9599 - val_loss: 1.4450 - val_accuracy: 0.6925
Epoch 169/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2778 - accuracy: 0.9654 - val_loss: 1.9209 - val_accuracy: 0.6162
Epoch 170/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2854 - accuracy: 0.9613 - val_loss: 1.4862 - val_accuracy: 0.6831
Epoch 171/200
43/43 [==============================] - 1s 15ms/step - loss: 0.2816 - accuracy: 0.9628 - val_loss: 1.5371 - val_accuracy: 0.6687
Epoch 172/200
43/43 [==============================] - 1s 14ms/step - loss: 0.2811 - accuracy: 0.9618 - val_loss: 1.5031 - val_accuracy: 0.6712
In [ ]:
_, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
accuracies_opt_200["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.352
Accuracy: 70.600%
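The training runs in this section terminate well before the nominal 200 epochs because an early-stopping callback is passed as `callbacks=[callback]` to `train_model`. A plausible reconstruction of that callback is sketched below; the `patience=20` value is inferred from the logs (each run stops 20 epochs after its best `val_loss`), and `restore_best_weights=True` is an assumption consistent with the reported test losses tracking the best validation losses.

```python
import tensorflow as tf

# Hypothetical reconstruction of the `callback` object passed to train_model.
# patience=20 matches the observed gap between the best val_loss epoch and
# the epoch at which each run stops; restore_best_weights is an assumption.
callback = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=20,
    restore_best_weights=True,
)
```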

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs = 200, callbacks = [callback])
Model: "sequential_15"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_42 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_6 ( (None, 512)               0         
_________________________________________________________________
dense_24 (Dense)             (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
43/43 [==============================] - 4s 54ms/step - loss: 2.7066 - accuracy: 0.1956 - val_loss: 1.4306 - val_accuracy: 0.5788
Epoch 2/200
43/43 [==============================] - 2s 50ms/step - loss: 1.5455 - accuracy: 0.5438 - val_loss: 1.0890 - val_accuracy: 0.6681
Epoch 3/200
43/43 [==============================] - 2s 50ms/step - loss: 1.0942 - accuracy: 0.6802 - val_loss: 0.9533 - val_accuracy: 0.7219
Epoch 4/200
43/43 [==============================] - 2s 51ms/step - loss: 0.8603 - accuracy: 0.7399 - val_loss: 0.9170 - val_accuracy: 0.7287
Epoch 5/200
43/43 [==============================] - 2s 50ms/step - loss: 0.6490 - accuracy: 0.8052 - val_loss: 0.9382 - val_accuracy: 0.7375
Epoch 6/200
43/43 [==============================] - 2s 50ms/step - loss: 0.5049 - accuracy: 0.8497 - val_loss: 0.8861 - val_accuracy: 0.7506
Epoch 7/200
43/43 [==============================] - 2s 50ms/step - loss: 0.3944 - accuracy: 0.8815 - val_loss: 0.9117 - val_accuracy: 0.7600
Epoch 8/200
43/43 [==============================] - 2s 50ms/step - loss: 0.2956 - accuracy: 0.9087 - val_loss: 0.9149 - val_accuracy: 0.7781
Epoch 9/200
43/43 [==============================] - 2s 50ms/step - loss: 0.2093 - accuracy: 0.9360 - val_loss: 0.9361 - val_accuracy: 0.7656
Epoch 10/200
43/43 [==============================] - 2s 50ms/step - loss: 0.1434 - accuracy: 0.9561 - val_loss: 1.0615 - val_accuracy: 0.7625
Epoch 11/200
43/43 [==============================] - 2s 50ms/step - loss: 0.1131 - accuracy: 0.9660 - val_loss: 1.1128 - val_accuracy: 0.7713
Epoch 12/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0828 - accuracy: 0.9743 - val_loss: 1.1053 - val_accuracy: 0.7781
Epoch 13/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0504 - accuracy: 0.9840 - val_loss: 1.2500 - val_accuracy: 0.7644
Epoch 14/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0747 - accuracy: 0.9758 - val_loss: 1.2643 - val_accuracy: 0.7575
Epoch 15/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0929 - accuracy: 0.9686 - val_loss: 1.1363 - val_accuracy: 0.7719
Epoch 16/200
43/43 [==============================] - 2s 51ms/step - loss: 0.0686 - accuracy: 0.9806 - val_loss: 1.2975 - val_accuracy: 0.7600
Epoch 17/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0541 - accuracy: 0.9806 - val_loss: 1.2461 - val_accuracy: 0.7731
Epoch 18/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0399 - accuracy: 0.9890 - val_loss: 1.2692 - val_accuracy: 0.7763
Epoch 19/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0239 - accuracy: 0.9932 - val_loss: 1.4204 - val_accuracy: 0.7638
Epoch 20/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0223 - accuracy: 0.9934 - val_loss: 1.3129 - val_accuracy: 0.7781
Epoch 21/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0206 - accuracy: 0.9954 - val_loss: 1.2817 - val_accuracy: 0.7806
Epoch 22/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0119 - accuracy: 0.9967 - val_loss: 1.4749 - val_accuracy: 0.7625
Epoch 23/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0238 - accuracy: 0.9939 - val_loss: 1.2947 - val_accuracy: 0.7731
Epoch 24/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0196 - accuracy: 0.9950 - val_loss: 1.4430 - val_accuracy: 0.7544
Epoch 25/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0295 - accuracy: 0.9902 - val_loss: 1.3528 - val_accuracy: 0.7588
Epoch 26/200
43/43 [==============================] - 2s 50ms/step - loss: 0.0360 - accuracy: 0.9893 - val_loss: 1.2158 - val_accuracy: 0.7844
In [ ]:
_, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
accuracies_opt_200["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.854
Accuracy: 75.900%
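The `init_VGG16_model_optimized` helper is defined earlier in the notebook; judging from the printed summary (a VGG16 base producing (1, 1, 512) features from 32×32 inputs, followed by dropout, global average pooling, and a 20-class softmax head), it likely looks roughly like the sketch below. The dropout rate, optimizer, and learning rate are assumptions, and the `weights` parameter is added here only so the sketch can be built offline.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def init_VGG16_model_optimized(trainable=True, weights="imagenet"):
    # VGG16 base without the ImageNet classifier head; 32x32 CIFAR inputs
    # yield (1, 1, 512) feature maps, as in the printed summary.
    base = tf.keras.applications.VGG16(
        include_top=False, weights=weights, input_shape=(32, 32, 3)
    )
    base.trainable = trainable  # True => fine-tune the entire base

    model = models.Sequential([
        base,
        layers.Dropout(0.2),                     # assumed rate
        layers.GlobalAveragePooling2D(),
        layers.Dense(20, activation="softmax"),  # 20-class head
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),  # assumed optimizer/LR
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    model.summary()
    return model
```

With `trainable=True` all 14,724,948 parameters are updated, which matches the "Trainable params" line in the summary above.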
MobileNetV2
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks=[callback])
Model: "sequential_16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_43 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_7 ( (None, 1280)              0         
_________________________________________________________________
dense_25 (Dense)             (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
43/43 [==============================] - 51s 1s/step - loss: 2.3965 - accuracy: 0.3121 - val_loss: 2.4906 - val_accuracy: 0.3731
Epoch 2/200
43/43 [==============================] - 45s 1s/step - loss: 0.5952 - accuracy: 0.8327 - val_loss: 2.0099 - val_accuracy: 0.4456
Epoch 3/200
43/43 [==============================] - 45s 1s/step - loss: 0.2737 - accuracy: 0.9330 - val_loss: 1.9899 - val_accuracy: 0.4581
Epoch 4/200
43/43 [==============================] - 45s 1s/step - loss: 0.1367 - accuracy: 0.9775 - val_loss: 1.9278 - val_accuracy: 0.4706
Epoch 5/200
43/43 [==============================] - 45s 1s/step - loss: 0.0661 - accuracy: 0.9958 - val_loss: 1.9408 - val_accuracy: 0.4769
Epoch 6/200
43/43 [==============================] - 45s 1s/step - loss: 0.0390 - accuracy: 0.9989 - val_loss: 1.9676 - val_accuracy: 0.4700
Epoch 7/200
43/43 [==============================] - 45s 1s/step - loss: 0.0238 - accuracy: 0.9999 - val_loss: 2.0341 - val_accuracy: 0.4663
Epoch 8/200
43/43 [==============================] - 45s 1s/step - loss: 0.0162 - accuracy: 1.0000 - val_loss: 1.9700 - val_accuracy: 0.4762
Epoch 9/200
43/43 [==============================] - 45s 1s/step - loss: 0.0115 - accuracy: 0.9999 - val_loss: 2.0161 - val_accuracy: 0.4762
Epoch 10/200
43/43 [==============================] - 45s 1s/step - loss: 0.0094 - accuracy: 1.0000 - val_loss: 1.9600 - val_accuracy: 0.4894
Epoch 11/200
43/43 [==============================] - 45s 1s/step - loss: 0.0070 - accuracy: 1.0000 - val_loss: 1.9947 - val_accuracy: 0.4762
Epoch 12/200
43/43 [==============================] - 45s 1s/step - loss: 0.0058 - accuracy: 1.0000 - val_loss: 2.0783 - val_accuracy: 0.4656
Epoch 13/200
43/43 [==============================] - 45s 1s/step - loss: 0.0052 - accuracy: 0.9999 - val_loss: 1.9990 - val_accuracy: 0.4775
Epoch 14/200
43/43 [==============================] - 45s 1s/step - loss: 0.0041 - accuracy: 1.0000 - val_loss: 2.1095 - val_accuracy: 0.4706
Epoch 15/200
43/43 [==============================] - 45s 1s/step - loss: 0.0037 - accuracy: 1.0000 - val_loss: 2.1919 - val_accuracy: 0.4650
Epoch 16/200
43/43 [==============================] - 45s 1s/step - loss: 0.0030 - accuracy: 1.0000 - val_loss: 2.2744 - val_accuracy: 0.4538
Epoch 17/200
43/43 [==============================] - 45s 1s/step - loss: 0.0025 - accuracy: 1.0000 - val_loss: 2.3973 - val_accuracy: 0.4300
Epoch 18/200
43/43 [==============================] - 45s 1s/step - loss: 0.0024 - accuracy: 1.0000 - val_loss: 2.4419 - val_accuracy: 0.4225
Epoch 19/200
43/43 [==============================] - 45s 1s/step - loss: 0.0020 - accuracy: 1.0000 - val_loss: 2.4862 - val_accuracy: 0.4150
Epoch 20/200
43/43 [==============================] - 45s 1s/step - loss: 0.0018 - accuracy: 1.0000 - val_loss: 2.5569 - val_accuracy: 0.3969
Epoch 21/200
43/43 [==============================] - 45s 1s/step - loss: 0.0017 - accuracy: 1.0000 - val_loss: 2.6129 - val_accuracy: 0.3938
Epoch 22/200
43/43 [==============================] - 45s 1s/step - loss: 0.0017 - accuracy: 1.0000 - val_loss: 2.6702 - val_accuracy: 0.3769
Epoch 23/200
43/43 [==============================] - 45s 1s/step - loss: 0.0013 - accuracy: 1.0000 - val_loss: 2.7267 - val_accuracy: 0.3688
Epoch 24/200
43/43 [==============================] - 45s 1s/step - loss: 0.0012 - accuracy: 1.0000 - val_loss: 2.7610 - val_accuracy: 0.3656
In [ ]:
_, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
accuracies_opt_200["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     2.063
Accuracy: 45.950%
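By analogy, `init_MobileNetV2_model_optimized` probably wraps a MobileNetV2 base at its native 224×224 resolution, which is why the resized datasets `train_ds_res`/`test_ds_res` are used here and why the summary shows (7, 7, 1280) feature maps. The rapid overfit visible above (training accuracy reaching 1.0000 while validation stays near 47%) is consistent with fine-tuning the entire base on a comparatively small training set. As before, the dropout rate and optimizer settings are assumptions, and `weights` is parameterized only for offline construction.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def init_MobileNetV2_model_optimized(trainable=True, weights="imagenet"):
    # MobileNetV2 base at its native 224x224 input size; 224/32 = 7,
    # hence the (7, 7, 1280) feature maps in the printed summary.
    base = tf.keras.applications.MobileNetV2(
        include_top=False, weights=weights, input_shape=(224, 224, 3)
    )
    # Even with trainable=True, BatchNorm moving statistics remain
    # non-trainable, giving the 34,112 non-trainable params in the summary.
    base.trainable = trainable

    model = models.Sequential([
        base,
        layers.Dropout(0.2),                     # assumed rate
        layers.GlobalAveragePooling2D(),
        layers.Dense(20, activation="softmax"),  # 20-class head
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),  # assumed optimizer/LR
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    model.summary()
    return model
```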
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_17"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_44 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_8 ( (None, 1024)              0         
_________________________________________________________________
dense_26 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
43/43 [==============================] - 14s 98ms/step - loss: 3.8297 - accuracy: 0.0849 - val_loss: 2.7937 - val_accuracy: 0.1994
Epoch 2/200
43/43 [==============================] - 3s 64ms/step - loss: 2.3662 - accuracy: 0.2868 - val_loss: 2.1920 - val_accuracy: 0.3625
Epoch 3/200
43/43 [==============================] - 3s 64ms/step - loss: 1.6899 - accuracy: 0.4960 - val_loss: 1.8941 - val_accuracy: 0.4919
Epoch 4/200
43/43 [==============================] - 3s 64ms/step - loss: 1.2088 - accuracy: 0.6404 - val_loss: 1.6780 - val_accuracy: 0.5744
Epoch 5/200
43/43 [==============================] - 3s 64ms/step - loss: 0.8708 - accuracy: 0.7403 - val_loss: 1.4297 - val_accuracy: 0.6431
Epoch 6/200
43/43 [==============================] - 3s 64ms/step - loss: 0.5992 - accuracy: 0.8214 - val_loss: 1.2734 - val_accuracy: 0.6681
Epoch 7/200
43/43 [==============================] - 3s 64ms/step - loss: 0.4519 - accuracy: 0.8651 - val_loss: 1.1108 - val_accuracy: 0.6944
Epoch 8/200
43/43 [==============================] - 3s 64ms/step - loss: 0.3138 - accuracy: 0.9144 - val_loss: 1.0331 - val_accuracy: 0.7100
Epoch 9/200
43/43 [==============================] - 3s 64ms/step - loss: 0.2272 - accuracy: 0.9387 - val_loss: 0.9589 - val_accuracy: 0.7200
Epoch 10/200
43/43 [==============================] - 3s 64ms/step - loss: 0.1584 - accuracy: 0.9590 - val_loss: 0.9690 - val_accuracy: 0.7219
Epoch 11/200
43/43 [==============================] - 3s 64ms/step - loss: 0.1132 - accuracy: 0.9761 - val_loss: 0.9575 - val_accuracy: 0.7331
Epoch 12/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0829 - accuracy: 0.9843 - val_loss: 0.9950 - val_accuracy: 0.7250
Epoch 13/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0661 - accuracy: 0.9862 - val_loss: 0.9962 - val_accuracy: 0.7362
Epoch 14/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0512 - accuracy: 0.9917 - val_loss: 1.0347 - val_accuracy: 0.7394
Epoch 15/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0387 - accuracy: 0.9942 - val_loss: 1.0605 - val_accuracy: 0.7381
Epoch 16/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0298 - accuracy: 0.9961 - val_loss: 1.1013 - val_accuracy: 0.7356
Epoch 17/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0257 - accuracy: 0.9963 - val_loss: 1.0682 - val_accuracy: 0.7456
Epoch 18/200
43/43 [==============================] - 3s 65ms/step - loss: 0.0253 - accuracy: 0.9960 - val_loss: 1.1167 - val_accuracy: 0.7362
Epoch 19/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0175 - accuracy: 0.9982 - val_loss: 1.1447 - val_accuracy: 0.7319
Epoch 20/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0162 - accuracy: 0.9981 - val_loss: 1.1780 - val_accuracy: 0.7400
Epoch 21/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0153 - accuracy: 0.9976 - val_loss: 1.1943 - val_accuracy: 0.7437
Epoch 22/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0113 - accuracy: 0.9986 - val_loss: 1.1405 - val_accuracy: 0.7494
Epoch 23/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0125 - accuracy: 0.9986 - val_loss: 1.1720 - val_accuracy: 0.7462
Epoch 24/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0104 - accuracy: 0.9991 - val_loss: 1.1602 - val_accuracy: 0.7544
Epoch 25/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0095 - accuracy: 0.9995 - val_loss: 1.1557 - val_accuracy: 0.7550
Epoch 26/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0087 - accuracy: 0.9997 - val_loss: 1.1880 - val_accuracy: 0.7456
Epoch 27/200
43/43 [==============================] - 3s 65ms/step - loss: 0.0070 - accuracy: 0.9995 - val_loss: 1.2020 - val_accuracy: 0.7344
Epoch 28/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0073 - accuracy: 0.9991 - val_loss: 1.2265 - val_accuracy: 0.7444
Epoch 29/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0064 - accuracy: 0.9995 - val_loss: 1.1780 - val_accuracy: 0.7513
Epoch 30/200
43/43 [==============================] - 3s 64ms/step - loss: 0.0054 - accuracy: 0.9997 - val_loss: 1.2012 - val_accuracy: 0.7500
Epoch 31/200
43/43 [==============================] - 3s 65ms/step - loss: 0.0056 - accuracy: 0.9993 - val_loss: 1.2134 - val_accuracy: 0.7450
In [ ]:
_, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
accuracies_opt_200["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.015
Accuracy: 71.750%

Comparison bar plots

In [ ]:
# set width of bar
barWidth = 0.15
model_names = ['Simple Model', 'CNN1', 'CNN2', 'VGG16', 'MobileNet', 'DenseNet']

# set height of bars (one list per batch size: 32, 64, 128, 200)
keys = ["SIMPLE_MODEL", "CNN1", "CNN2", "VGG_ALL", "MOBILENET_ALL", "DENSENET_ALL"]
bar1 = [accuracies_opt[k] for k in keys]
bar2 = [accuracies_opt_64[k] for k in keys]
bar3 = [accuracies_opt_128[k] for k in keys]
bar4 = [accuracies_opt_200[k] for k in keys]

# Set position of bar on X axis
r1 = np.arange(6)
r2 = [x + barWidth for x in r1]
r3 = [x + barWidth for x in r2]
r4 = [x + barWidth for x in r3]


plt.figure(figsize=(12,5))
plt.bar(r1, bar1, color='#003f5c', width=barWidth, edgecolor='white', label = '32')
plt.bar(r2, bar2, color='#ffa600', width=barWidth, edgecolor='white', label = '64')
plt.bar(r3, bar3, color='#bc5090', width=barWidth, edgecolor='white', label = '128')
plt.bar(r4, bar4, color='#25A640', width=barWidth, edgecolor='white', label = '200')
plt.xticks([r + 1.5*barWidth for r in range(6)], model_names)
plt.ylim(bottom=0.1)
plt.legend(loc='best')
plt.title("Experiments on Batch Size")
plt.ylabel("Classification Accuracy")
plt.grid(axis="y", linestyle="--")
plt.show()

The batch size is the number of samples from the dataset that are processed together in a single pass during training; the larger its value, the more memory the training requires. Increasing it from 32 to 64, 128, and finally 200, we observe that the performance of most models is not significantly affected. The only notable difference appears in MobileNet, whose classification accuracy drops sharply at the larger sizes (128 and 200). We can also see that in the majority of cases the best accuracies are obtained with the smaller batch sizes (32 and 64).
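The memory/steps trade-off described above can be sketched with a quick back-of-the-envelope calculation in plain NumPy. The sample count of 8500 is an assumption, chosen to be consistent with the 266 steps per epoch at batch size 32 and the 43 steps per epoch at batch size 200 visible in the logs:

```python
import numpy as np

# Assumed training-set size for the 20-class subset (consistent with the
# 266 steps/epoch at batch 32 and 43 steps/epoch at batch 200 in the logs).
NUM_SAMPLES = 8500
IMG_BYTES = 32 * 32 * 3 * 4  # one float32 CIFAR image

for batch_size in (32, 64, 128, 200):
    # Steps per epoch shrink as the batch grows, but the input tensor held
    # in memory per step grows linearly with the batch size.
    steps_per_epoch = int(np.ceil(NUM_SAMPLES / batch_size))
    batch_mb = batch_size * IMG_BYTES / 2**20
    print(f"batch={batch_size:3d}  steps/epoch={steps_per_epoch:3d}  input/batch = {batch_mb:.2f} MB")
```

This only accounts for the input images; activations and gradients inside the network also scale with the batch size, which is usually the dominant memory cost.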

Effect of the optimization algorithm (optimizer) on performance

So far, the networks have been trained with the Adam optimizer. We now experiment with different optimization algorithms (Nadam, SGD, and RMSprop) to see how they affect the test accuracy of our models. Note that for the following training runs we keep the batch size fixed at 32, the number of classes fixed at 20, the learning rate at 0.00005, and the maximum number of epochs at 200 (with early stopping).
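The optimizers being compared differ only in how they turn a gradient into a parameter update. Below is a minimal NumPy sketch of the simplified update rules, not the actual TensorFlow implementations; Nadam (omitted here) is Adam with a Nesterov-style momentum correction:

```python
import numpy as np

def sgd_step(w, g, lr=5e-5):
    # Plain SGD: step directly against the gradient.
    return w - lr * g

def rmsprop_step(w, g, state, lr=5e-5, rho=0.9, eps=1e-7):
    # RMSprop: scale the step by a running average of squared gradients.
    state["v"] = rho * state["v"] + (1 - rho) * g**2
    return w - lr * g / (np.sqrt(state["v"]) + eps)

def adam_step(w, g, state, lr=5e-5, b1=0.9, b2=0.999, eps=1e-7):
    # Adam: bias-corrected first and second moment estimates.
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * g
    state["v"] = b2 * state["v"] + (1 - b2) * g**2
    m_hat = state["m"] / (1 - b1 ** state["t"])
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Minimize f(w) = w^2 (gradient 2w) for a few steps with each rule.
w_sgd = w_rms = w_adam = 1.0
rms_state = {"v": 0.0}
adam_state = {"m": 0.0, "v": 0.0, "t": 0}
for _ in range(100):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd)
    w_rms = rmsprop_step(w_rms, 2 * w_rms, rms_state)
    w_adam = adam_step(w_adam, 2 * w_adam, adam_state)
print(w_sgd, w_rms, w_adam)
```

The adaptive methods (RMSprop, Adam, Nadam) normalize the step per parameter, which is why they can behave quite differently from plain SGD at the same learning rate.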

In [ ]:
BATCH_SIZE = 32

def _input_fn(x, y, batch_size):
  # data_size and AUTOTUNE are defined earlier in the notebook
  ds = tf.data.Dataset.from_tensor_slices((x, y))
  ds = ds.shuffle(buffer_size=data_size)
  ds = ds.repeat()
  ds = ds.batch(batch_size)
  ds = ds.prefetch(buffer_size=AUTOTUNE)
  return ds

train_ds = _input_fn(x_train, y_train, BATCH_SIZE)            # PrefetchDataset
validation_ds = _input_fn(x_val, y_val, BATCH_SIZE)           # PrefetchDataset
test_ds = _input_fn(x_test, y_test, BATCH_SIZE)               # PrefetchDataset

# resized versions for the transfer-learning models
train_ds_res = train_ds.map(resize_transform)
validation_ds_res = validation_ds.map(resize_transform)
test_ds_res = test_ds.map(resize_transform)

def train_model(model, train_dataset=train_ds, validation_dataset=validation_ds,
                epochs=100, callbacks=None,
                steps_per_epoch=int(np.ceil(x_train.shape[0] / BATCH_SIZE)),
                validation_steps=int(np.ceil(x_val.shape[0] / BATCH_SIZE))):
  history = model.fit(train_dataset, epochs=epochs, steps_per_epoch=steps_per_epoch,
                      validation_data=validation_dataset, validation_steps=validation_steps,
                      callbacks=callbacks)
  return history

def model_report(model, history, evaluation_dataset=test_ds,
                 evaluation_steps=int(np.ceil(x_test.shape[0] / BATCH_SIZE))):
  # summarize_diagnostics returns the pyplot handle; avoid shadowing the global plt
  fig = summarize_diagnostics(history)
  fig.show()
  return model_evaluation(model, evaluation_dataset, evaluation_steps)

Nadam

Networks trained "from scratch"

In [ ]:
accuracies_opt_Nadam = {}
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True, optimizer = tf.optimizers.Nadam)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_14"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_10 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization (BatchNo (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu (ReLU)                 (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_7 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_11 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_1 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_1 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_8 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_12 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_2 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_2 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten_3 (Flatten)          (None, 1024)              0         
_________________________________________________________________
dropout_13 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_18 (Dense)             (None, 64)                65600     
_________________________________________________________________
dense_19 (Dense)             (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 3s 6ms/step - loss: 4.2158 - accuracy: 0.0797 - val_loss: 4.1659 - val_accuracy: 0.0412
Epoch 2/200
266/266 [==============================] - 2s 6ms/step - loss: 3.7572 - accuracy: 0.1790 - val_loss: 3.6124 - val_accuracy: 0.2101
Epoch 3/200
266/266 [==============================] - 2s 6ms/step - loss: 3.5402 - accuracy: 0.2311 - val_loss: 3.2783 - val_accuracy: 0.2919
Epoch 4/200
266/266 [==============================] - 2s 6ms/step - loss: 3.3165 - accuracy: 0.2801 - val_loss: 3.1549 - val_accuracy: 0.3125
Epoch 5/200
266/266 [==============================] - 2s 6ms/step - loss: 3.1460 - accuracy: 0.3108 - val_loss: 3.0212 - val_accuracy: 0.3384
Epoch 6/200
266/266 [==============================] - 2s 6ms/step - loss: 3.0133 - accuracy: 0.3400 - val_loss: 2.8650 - val_accuracy: 0.3723
Epoch 7/200
266/266 [==============================] - 1s 6ms/step - loss: 2.8478 - accuracy: 0.3685 - val_loss: 2.7791 - val_accuracy: 0.3870
Epoch 8/200
266/266 [==============================] - 2s 6ms/step - loss: 2.7155 - accuracy: 0.3971 - val_loss: 2.6566 - val_accuracy: 0.4156
Epoch 9/200
266/266 [==============================] - 2s 6ms/step - loss: 2.6273 - accuracy: 0.4155 - val_loss: 2.6217 - val_accuracy: 0.4242
Epoch 10/200
266/266 [==============================] - 2s 6ms/step - loss: 2.5438 - accuracy: 0.4318 - val_loss: 2.6057 - val_accuracy: 0.4136
Epoch 11/200
266/266 [==============================] - 2s 6ms/step - loss: 2.4556 - accuracy: 0.4376 - val_loss: 2.5103 - val_accuracy: 0.4362
Epoch 12/200
266/266 [==============================] - 2s 6ms/step - loss: 2.3556 - accuracy: 0.4634 - val_loss: 2.3091 - val_accuracy: 0.4854
Epoch 13/200
266/266 [==============================] - 2s 6ms/step - loss: 2.2931 - accuracy: 0.4748 - val_loss: 2.2814 - val_accuracy: 0.4867
Epoch 14/200
266/266 [==============================] - 2s 6ms/step - loss: 2.2113 - accuracy: 0.4912 - val_loss: 2.2576 - val_accuracy: 0.4854
Epoch 15/200
266/266 [==============================] - 2s 6ms/step - loss: 2.1489 - accuracy: 0.5059 - val_loss: 2.4382 - val_accuracy: 0.4521
Epoch 16/200
266/266 [==============================] - 2s 6ms/step - loss: 2.1341 - accuracy: 0.5027 - val_loss: 2.3637 - val_accuracy: 0.4608
Epoch 17/200
266/266 [==============================] - 2s 6ms/step - loss: 2.0507 - accuracy: 0.5177 - val_loss: 2.0745 - val_accuracy: 0.5246
Epoch 18/200
266/266 [==============================] - 1s 6ms/step - loss: 1.9920 - accuracy: 0.5387 - val_loss: 2.0732 - val_accuracy: 0.5166
Epoch 19/200
266/266 [==============================] - 1s 6ms/step - loss: 1.9250 - accuracy: 0.5490 - val_loss: 2.1673 - val_accuracy: 0.5040
Epoch 20/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9035 - accuracy: 0.5437 - val_loss: 1.9935 - val_accuracy: 0.5253
Epoch 21/200
266/266 [==============================] - 2s 6ms/step - loss: 1.8728 - accuracy: 0.5483 - val_loss: 2.0283 - val_accuracy: 0.5253
Epoch 22/200
266/266 [==============================] - 2s 6ms/step - loss: 1.7927 - accuracy: 0.5730 - val_loss: 2.0139 - val_accuracy: 0.5226
Epoch 23/200
266/266 [==============================] - 1s 6ms/step - loss: 1.7833 - accuracy: 0.5656 - val_loss: 2.1574 - val_accuracy: 0.4980
Epoch 24/200
266/266 [==============================] - 1s 6ms/step - loss: 1.7253 - accuracy: 0.5802 - val_loss: 1.9094 - val_accuracy: 0.5485
Epoch 25/200
266/266 [==============================] - 2s 6ms/step - loss: 1.7012 - accuracy: 0.5849 - val_loss: 1.9361 - val_accuracy: 0.5465
Epoch 26/200
266/266 [==============================] - 2s 6ms/step - loss: 1.6662 - accuracy: 0.5950 - val_loss: 2.1293 - val_accuracy: 0.4880
Epoch 27/200
266/266 [==============================] - 2s 6ms/step - loss: 1.6474 - accuracy: 0.5941 - val_loss: 2.1083 - val_accuracy: 0.4987
Epoch 28/200
266/266 [==============================] - 2s 6ms/step - loss: 1.5975 - accuracy: 0.6103 - val_loss: 1.7912 - val_accuracy: 0.5625
Epoch 29/200
266/266 [==============================] - 1s 6ms/step - loss: 1.5805 - accuracy: 0.6085 - val_loss: 1.7134 - val_accuracy: 0.5851
Epoch 30/200
266/266 [==============================] - 2s 6ms/step - loss: 1.5284 - accuracy: 0.6190 - val_loss: 1.6624 - val_accuracy: 0.5931
Epoch 31/200
266/266 [==============================] - 1s 6ms/step - loss: 1.4947 - accuracy: 0.6309 - val_loss: 1.7744 - val_accuracy: 0.5711
Epoch 32/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4740 - accuracy: 0.6366 - val_loss: 1.6825 - val_accuracy: 0.5918
Epoch 33/200
266/266 [==============================] - 1s 6ms/step - loss: 1.4755 - accuracy: 0.6311 - val_loss: 1.6885 - val_accuracy: 0.5891
Epoch 34/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4245 - accuracy: 0.6447 - val_loss: 1.6124 - val_accuracy: 0.6031
Epoch 35/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3858 - accuracy: 0.6469 - val_loss: 1.6935 - val_accuracy: 0.5851
Epoch 36/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3781 - accuracy: 0.6587 - val_loss: 1.6477 - val_accuracy: 0.5844
Epoch 37/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3923 - accuracy: 0.6464 - val_loss: 1.6645 - val_accuracy: 0.5811
Epoch 38/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3671 - accuracy: 0.6602 - val_loss: 1.6269 - val_accuracy: 0.6004
Epoch 39/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3162 - accuracy: 0.6730 - val_loss: 1.5921 - val_accuracy: 0.6084
Epoch 40/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3122 - accuracy: 0.6654 - val_loss: 1.5284 - val_accuracy: 0.6297
Epoch 41/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3048 - accuracy: 0.6591 - val_loss: 1.5111 - val_accuracy: 0.6230
Epoch 42/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2679 - accuracy: 0.6867 - val_loss: 1.5991 - val_accuracy: 0.6051
Epoch 43/200
266/266 [==============================] - 1s 6ms/step - loss: 1.2681 - accuracy: 0.6709 - val_loss: 1.6090 - val_accuracy: 0.5971
Epoch 44/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2408 - accuracy: 0.6843 - val_loss: 1.7181 - val_accuracy: 0.5785
Epoch 45/200
266/266 [==============================] - 1s 6ms/step - loss: 1.2237 - accuracy: 0.6780 - val_loss: 1.7107 - val_accuracy: 0.5645
Epoch 46/200
266/266 [==============================] - 1s 6ms/step - loss: 1.1945 - accuracy: 0.6881 - val_loss: 1.6775 - val_accuracy: 0.5765
Epoch 47/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1578 - accuracy: 0.6987 - val_loss: 1.6695 - val_accuracy: 0.5871
Epoch 48/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1459 - accuracy: 0.7071 - val_loss: 1.5791 - val_accuracy: 0.6044
Epoch 49/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1657 - accuracy: 0.7006 - val_loss: 1.4794 - val_accuracy: 0.6283
Epoch 50/200
266/266 [==============================] - 1s 6ms/step - loss: 1.1445 - accuracy: 0.7104 - val_loss: 1.6813 - val_accuracy: 0.5811
Epoch 51/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1241 - accuracy: 0.7041 - val_loss: 1.4753 - val_accuracy: 0.6290
Epoch 52/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0837 - accuracy: 0.7174 - val_loss: 1.4986 - val_accuracy: 0.6230
Epoch 53/200
266/266 [==============================] - 1s 6ms/step - loss: 1.1219 - accuracy: 0.7066 - val_loss: 1.4267 - val_accuracy: 0.6323
Epoch 54/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1122 - accuracy: 0.7092 - val_loss: 1.4510 - val_accuracy: 0.6290
Epoch 55/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0926 - accuracy: 0.7165 - val_loss: 1.5120 - val_accuracy: 0.6316
Epoch 56/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0929 - accuracy: 0.7168 - val_loss: 1.5176 - val_accuracy: 0.6157
Epoch 57/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0456 - accuracy: 0.7300 - val_loss: 1.4317 - val_accuracy: 0.6316
Epoch 58/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0498 - accuracy: 0.7240 - val_loss: 1.3919 - val_accuracy: 0.6483
Epoch 59/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0152 - accuracy: 0.7376 - val_loss: 1.4051 - val_accuracy: 0.6456
Epoch 60/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0281 - accuracy: 0.7330 - val_loss: 1.6794 - val_accuracy: 0.5858
Epoch 61/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0163 - accuracy: 0.7359 - val_loss: 1.3315 - val_accuracy: 0.6702
Epoch 62/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9903 - accuracy: 0.7370 - val_loss: 1.4537 - val_accuracy: 0.6383
Epoch 63/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9805 - accuracy: 0.7418 - val_loss: 1.4334 - val_accuracy: 0.6396
Epoch 64/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9832 - accuracy: 0.7429 - val_loss: 1.4780 - val_accuracy: 0.6343
Epoch 65/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9828 - accuracy: 0.7348 - val_loss: 1.4213 - val_accuracy: 0.6423
Epoch 66/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9813 - accuracy: 0.7471 - val_loss: 1.3121 - val_accuracy: 0.6616
Epoch 67/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9221 - accuracy: 0.7631 - val_loss: 1.3616 - val_accuracy: 0.6582
Epoch 68/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9341 - accuracy: 0.7550 - val_loss: 1.3727 - val_accuracy: 0.6569
Epoch 69/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9305 - accuracy: 0.7494 - val_loss: 1.3849 - val_accuracy: 0.6556
Epoch 70/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9343 - accuracy: 0.7553 - val_loss: 1.3311 - val_accuracy: 0.6636
Epoch 71/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9063 - accuracy: 0.7649 - val_loss: 1.3709 - val_accuracy: 0.6609
Epoch 72/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9030 - accuracy: 0.7548 - val_loss: 1.5139 - val_accuracy: 0.6250
Epoch 73/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9104 - accuracy: 0.7550 - val_loss: 1.3695 - val_accuracy: 0.6456
Epoch 74/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8600 - accuracy: 0.7743 - val_loss: 1.3404 - val_accuracy: 0.6629
Epoch 75/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8801 - accuracy: 0.7677 - val_loss: 1.3149 - val_accuracy: 0.6722
Epoch 76/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9057 - accuracy: 0.7617 - val_loss: 1.3952 - val_accuracy: 0.6516
Epoch 77/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8702 - accuracy: 0.7726 - val_loss: 1.3834 - val_accuracy: 0.6642
Epoch 78/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8668 - accuracy: 0.7687 - val_loss: 1.3379 - val_accuracy: 0.6576
Epoch 79/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8486 - accuracy: 0.7736 - val_loss: 1.3260 - val_accuracy: 0.6709
Epoch 80/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8220 - accuracy: 0.7827 - val_loss: 1.4799 - val_accuracy: 0.6469
Epoch 81/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8259 - accuracy: 0.7823 - val_loss: 1.3430 - val_accuracy: 0.6682
Epoch 82/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8445 - accuracy: 0.7772 - val_loss: 1.3481 - val_accuracy: 0.6589
Epoch 83/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8074 - accuracy: 0.7933 - val_loss: 1.2850 - val_accuracy: 0.6636
Epoch 84/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8073 - accuracy: 0.7786 - val_loss: 1.3369 - val_accuracy: 0.6602
Epoch 85/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8311 - accuracy: 0.7785 - val_loss: 1.3302 - val_accuracy: 0.6622
Epoch 86/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7961 - accuracy: 0.7878 - val_loss: 1.2848 - val_accuracy: 0.6709
Epoch 87/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8082 - accuracy: 0.7915 - val_loss: 1.3668 - val_accuracy: 0.6503
Epoch 88/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7795 - accuracy: 0.7930 - val_loss: 1.3641 - val_accuracy: 0.6536
Epoch 89/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7902 - accuracy: 0.7884 - val_loss: 1.3482 - val_accuracy: 0.6489
Epoch 90/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7785 - accuracy: 0.7906 - val_loss: 1.3006 - val_accuracy: 0.6762
Epoch 91/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7556 - accuracy: 0.7985 - val_loss: 1.2878 - val_accuracy: 0.6656
Epoch 92/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7633 - accuracy: 0.8011 - val_loss: 1.3892 - val_accuracy: 0.6543
Epoch 93/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7423 - accuracy: 0.8110 - val_loss: 1.3216 - val_accuracy: 0.6669
Epoch 94/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7836 - accuracy: 0.7840 - val_loss: 1.4651 - val_accuracy: 0.6350
Epoch 95/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7554 - accuracy: 0.7958 - val_loss: 1.3063 - val_accuracy: 0.6682
Epoch 96/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7432 - accuracy: 0.8058 - val_loss: 1.3493 - val_accuracy: 0.6562
Epoch 97/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7556 - accuracy: 0.7955 - val_loss: 1.3220 - val_accuracy: 0.6676
Epoch 98/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7190 - accuracy: 0.8130 - val_loss: 1.3389 - val_accuracy: 0.6562
Epoch 99/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7393 - accuracy: 0.8040 - val_loss: 1.3405 - val_accuracy: 0.6636
Epoch 100/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7364 - accuracy: 0.8019 - val_loss: 1.4356 - val_accuracy: 0.6410
Epoch 101/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7172 - accuracy: 0.8123 - val_loss: 1.3981 - val_accuracy: 0.6622
Epoch 102/200
266/266 [==============================] - 1s 6ms/step - loss: 0.7139 - accuracy: 0.8106 - val_loss: 1.3106 - val_accuracy: 0.6609
Epoch 103/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7263 - accuracy: 0.8081 - val_loss: 1.3197 - val_accuracy: 0.6722
Epoch 104/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7145 - accuracy: 0.8021 - val_loss: 1.2902 - val_accuracy: 0.6729
Epoch 105/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7099 - accuracy: 0.8165 - val_loss: 1.2821 - val_accuracy: 0.6715
Epoch 106/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7027 - accuracy: 0.8104 - val_loss: 1.5537 - val_accuracy: 0.6243
Epoch 107/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6944 - accuracy: 0.8129 - val_loss: 1.3216 - val_accuracy: 0.6702
Epoch 108/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7012 - accuracy: 0.8173 - val_loss: 1.3037 - val_accuracy: 0.6782
Epoch 109/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6764 - accuracy: 0.8238 - val_loss: 1.3159 - val_accuracy: 0.6729
Epoch 110/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7114 - accuracy: 0.8034 - val_loss: 1.3918 - val_accuracy: 0.6642
Epoch 111/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6796 - accuracy: 0.8217 - val_loss: 1.3010 - val_accuracy: 0.6815
Epoch 112/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6792 - accuracy: 0.8213 - val_loss: 1.3827 - val_accuracy: 0.6662
Epoch 113/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6615 - accuracy: 0.8300 - val_loss: 1.3993 - val_accuracy: 0.6676
Epoch 114/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6859 - accuracy: 0.8184 - val_loss: 1.3145 - val_accuracy: 0.6616
Epoch 115/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6683 - accuracy: 0.8267 - val_loss: 1.4155 - val_accuracy: 0.6556
Epoch 116/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6697 - accuracy: 0.8207 - val_loss: 1.2978 - val_accuracy: 0.6749
Epoch 117/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6765 - accuracy: 0.8176 - val_loss: 1.4239 - val_accuracy: 0.6456
Epoch 118/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6461 - accuracy: 0.8287 - val_loss: 1.4052 - val_accuracy: 0.6483
Epoch 119/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6547 - accuracy: 0.8281 - val_loss: 1.4452 - val_accuracy: 0.6449
Epoch 120/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6547 - accuracy: 0.8275 - val_loss: 1.4299 - val_accuracy: 0.6562
Epoch 121/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6562 - accuracy: 0.8263 - val_loss: 1.3287 - val_accuracy: 0.6802
Epoch 122/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6417 - accuracy: 0.8335 - val_loss: 1.3226 - val_accuracy: 0.6749
Epoch 123/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6476 - accuracy: 0.8366 - val_loss: 1.3773 - val_accuracy: 0.6616
Epoch 124/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6405 - accuracy: 0.8273 - val_loss: 1.2787 - val_accuracy: 0.6749
Epoch 125/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6227 - accuracy: 0.8390 - val_loss: 1.3553 - val_accuracy: 0.6616
Epoch 126/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6257 - accuracy: 0.8314 - val_loss: 1.3159 - val_accuracy: 0.6815
Epoch 127/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6234 - accuracy: 0.8387 - val_loss: 1.3622 - val_accuracy: 0.6596
Epoch 128/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6143 - accuracy: 0.8412 - val_loss: 1.3603 - val_accuracy: 0.6702
Epoch 129/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6150 - accuracy: 0.8347 - val_loss: 1.3440 - val_accuracy: 0.6642
Epoch 130/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6147 - accuracy: 0.8437 - val_loss: 1.4269 - val_accuracy: 0.6476
Epoch 131/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6190 - accuracy: 0.8352 - val_loss: 1.3276 - val_accuracy: 0.6822
Epoch 132/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6092 - accuracy: 0.8400 - val_loss: 1.3853 - val_accuracy: 0.6549
Epoch 133/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6143 - accuracy: 0.8332 - val_loss: 1.2534 - val_accuracy: 0.6895
Epoch 134/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6280 - accuracy: 0.8322 - val_loss: 1.3862 - val_accuracy: 0.6636
Epoch 135/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6028 - accuracy: 0.8419 - val_loss: 1.3458 - val_accuracy: 0.6602
Epoch 136/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5817 - accuracy: 0.8484 - val_loss: 1.3588 - val_accuracy: 0.6749
Epoch 137/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6117 - accuracy: 0.8412 - val_loss: 1.4775 - val_accuracy: 0.6423
Epoch 138/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5927 - accuracy: 0.8490 - val_loss: 1.3492 - val_accuracy: 0.6629
Epoch 139/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5906 - accuracy: 0.8485 - val_loss: 1.4619 - val_accuracy: 0.6503
Epoch 140/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6095 - accuracy: 0.8365 - val_loss: 1.3848 - val_accuracy: 0.6682
Epoch 141/200
266/266 [==============================] - 1s 6ms/step - loss: 0.6060 - accuracy: 0.8422 - val_loss: 1.4473 - val_accuracy: 0.6529
Epoch 142/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5948 - accuracy: 0.8485 - val_loss: 1.2815 - val_accuracy: 0.6835
Epoch 143/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5788 - accuracy: 0.8475 - val_loss: 1.3481 - val_accuracy: 0.6695
Epoch 144/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5877 - accuracy: 0.8445 - val_loss: 1.3009 - val_accuracy: 0.6742
Epoch 145/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5840 - accuracy: 0.8423 - val_loss: 1.2962 - val_accuracy: 0.6868
Epoch 146/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5842 - accuracy: 0.8534 - val_loss: 1.2912 - val_accuracy: 0.6809
Epoch 147/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5712 - accuracy: 0.8510 - val_loss: 1.3475 - val_accuracy: 0.6656
Epoch 148/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5910 - accuracy: 0.8436 - val_loss: 1.5346 - val_accuracy: 0.6423
Epoch 149/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5669 - accuracy: 0.8479 - val_loss: 1.3830 - val_accuracy: 0.6762
Epoch 150/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5639 - accuracy: 0.8469 - val_loss: 1.3086 - val_accuracy: 0.6769
Epoch 151/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5923 - accuracy: 0.8435 - val_loss: 1.3385 - val_accuracy: 0.6775
Epoch 152/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5562 - accuracy: 0.8528 - val_loss: 1.2859 - val_accuracy: 0.6941
Epoch 153/200
266/266 [==============================] - 2s 6ms/step - loss: 0.5368 - accuracy: 0.8677 - val_loss: 1.3238 - val_accuracy: 0.6722
In [ ]:
_, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
accuracies_opt_Nadam["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.265
Accuracy: 70.040%
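The `callback` passed to `train_model` is defined earlier in the notebook; assuming it is an EarlyStopping-style callback that monitors validation loss with a patience window (the function below is a hypothetical illustration, not the notebook's actual implementation), its core logic can be sketched in plain Python:

```python
def early_stopping(val_losses, patience=10):
    """Return the 0-based epoch at which training would stop: the first
    epoch after which val_loss has not improved for `patience` epochs,
    or the last epoch if training runs to completion."""
    best = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch      # new best: reset the window
        elif epoch - best_epoch >= patience:
            return epoch                        # no improvement for `patience` epochs
    return len(val_losses) - 1

# Toy run: val_loss improves until epoch 2, then plateaus.
losses = [1.0, 0.8, 0.7, 0.75, 0.72, 0.71, 0.74]
print(early_stopping(losses, patience=3))  # 5
```

This explains why the runs below terminate well before the nominal 200 epochs once validation loss stops improving.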
CNN1
In [ ]:
# build the optimized CNN1 with the Nadam optimizer and train it with early stopping
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True, optimizer = tf.optimizers.Nadam)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_15"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_13 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_9 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_14 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_14 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_10 (MaxPooling (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_15 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_15 (Conv2D)           (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d_1 (Average (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_16 (Dropout)         (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_4 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_20 (Dense)             (None, 1024)              525312    
_________________________________________________________________
dropout_17 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_21 (Dense)             (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 3s 6ms/step - loss: 4.2030 - accuracy: 0.1176 - val_loss: 4.3633 - val_accuracy: 0.0924
Epoch 2/200
266/266 [==============================] - 1s 6ms/step - loss: 3.5872 - accuracy: 0.2656 - val_loss: 3.6091 - val_accuracy: 0.2287
Epoch 3/200
266/266 [==============================] - 1s 6ms/step - loss: 3.2503 - accuracy: 0.3368 - val_loss: 3.0973 - val_accuracy: 0.3551
Epoch 4/200
266/266 [==============================] - 1s 6ms/step - loss: 2.9765 - accuracy: 0.3914 - val_loss: 3.0214 - val_accuracy: 0.3644
Epoch 5/200
266/266 [==============================] - 2s 6ms/step - loss: 2.7840 - accuracy: 0.4164 - val_loss: 2.8966 - val_accuracy: 0.3803
Epoch 6/200
266/266 [==============================] - 2s 6ms/step - loss: 2.6099 - accuracy: 0.4494 - val_loss: 2.8311 - val_accuracy: 0.4056
Epoch 7/200
266/266 [==============================] - 2s 6ms/step - loss: 2.4875 - accuracy: 0.4696 - val_loss: 2.5455 - val_accuracy: 0.4601
Epoch 8/200
266/266 [==============================] - 2s 6ms/step - loss: 2.3551 - accuracy: 0.4968 - val_loss: 2.3642 - val_accuracy: 0.4967
Epoch 9/200
266/266 [==============================] - 1s 6ms/step - loss: 2.2747 - accuracy: 0.5042 - val_loss: 2.5061 - val_accuracy: 0.4501
Epoch 10/200
266/266 [==============================] - 2s 6ms/step - loss: 2.1700 - accuracy: 0.5304 - val_loss: 2.2074 - val_accuracy: 0.5246
Epoch 11/200
266/266 [==============================] - 2s 6ms/step - loss: 2.1054 - accuracy: 0.5304 - val_loss: 2.3046 - val_accuracy: 0.4867
Epoch 12/200
266/266 [==============================] - 2s 6ms/step - loss: 2.0406 - accuracy: 0.5464 - val_loss: 2.0604 - val_accuracy: 0.5372
Epoch 13/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9652 - accuracy: 0.5425 - val_loss: 2.1585 - val_accuracy: 0.5120
Epoch 14/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9155 - accuracy: 0.5587 - val_loss: 2.1162 - val_accuracy: 0.5199
Epoch 15/200
266/266 [==============================] - 2s 6ms/step - loss: 1.8506 - accuracy: 0.5727 - val_loss: 1.8995 - val_accuracy: 0.5638
Epoch 16/200
266/266 [==============================] - 2s 6ms/step - loss: 1.7967 - accuracy: 0.5894 - val_loss: 2.0503 - val_accuracy: 0.5259
Epoch 17/200
266/266 [==============================] - 2s 6ms/step - loss: 1.7219 - accuracy: 0.6097 - val_loss: 2.0555 - val_accuracy: 0.5153
Epoch 18/200
266/266 [==============================] - 2s 6ms/step - loss: 1.7007 - accuracy: 0.6032 - val_loss: 2.0774 - val_accuracy: 0.5113
Epoch 19/200
266/266 [==============================] - 2s 6ms/step - loss: 1.6375 - accuracy: 0.6107 - val_loss: 1.7282 - val_accuracy: 0.5931
Epoch 20/200
266/266 [==============================] - 2s 6ms/step - loss: 1.6092 - accuracy: 0.6143 - val_loss: 1.9594 - val_accuracy: 0.5432
Epoch 21/200
266/266 [==============================] - 2s 6ms/step - loss: 1.5855 - accuracy: 0.6241 - val_loss: 2.0231 - val_accuracy: 0.5160
Epoch 22/200
266/266 [==============================] - 1s 6ms/step - loss: 1.5475 - accuracy: 0.6331 - val_loss: 1.7244 - val_accuracy: 0.5785
Epoch 23/200
266/266 [==============================] - 1s 6ms/step - loss: 1.4967 - accuracy: 0.6434 - val_loss: 1.7164 - val_accuracy: 0.5891
Epoch 24/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4667 - accuracy: 0.6407 - val_loss: 1.7556 - val_accuracy: 0.5745
Epoch 25/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4175 - accuracy: 0.6605 - val_loss: 1.8062 - val_accuracy: 0.5652
Epoch 26/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4191 - accuracy: 0.6479 - val_loss: 1.7438 - val_accuracy: 0.5725
Epoch 27/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3503 - accuracy: 0.6697 - val_loss: 1.6674 - val_accuracy: 0.6004
Epoch 28/200
266/266 [==============================] - 1s 6ms/step - loss: 1.3478 - accuracy: 0.6737 - val_loss: 1.7754 - val_accuracy: 0.5525
Epoch 29/200
266/266 [==============================] - 1s 6ms/step - loss: 1.3308 - accuracy: 0.6703 - val_loss: 1.7518 - val_accuracy: 0.5758
Epoch 30/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3059 - accuracy: 0.6786 - val_loss: 1.5400 - val_accuracy: 0.6031
Epoch 31/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2647 - accuracy: 0.6814 - val_loss: 1.6098 - val_accuracy: 0.5951
Epoch 32/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2650 - accuracy: 0.6859 - val_loss: 1.7128 - val_accuracy: 0.5891
Epoch 33/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2317 - accuracy: 0.6933 - val_loss: 1.6703 - val_accuracy: 0.6017
Epoch 34/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2080 - accuracy: 0.6968 - val_loss: 1.4790 - val_accuracy: 0.6330
Epoch 35/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1964 - accuracy: 0.6969 - val_loss: 1.4917 - val_accuracy: 0.6283
Epoch 36/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1640 - accuracy: 0.7037 - val_loss: 1.3911 - val_accuracy: 0.6503
Epoch 37/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1387 - accuracy: 0.7100 - val_loss: 1.4540 - val_accuracy: 0.6323
Epoch 38/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1195 - accuracy: 0.7175 - val_loss: 1.4910 - val_accuracy: 0.6184
Epoch 39/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1243 - accuracy: 0.7135 - val_loss: 1.4321 - val_accuracy: 0.6476
Epoch 40/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1024 - accuracy: 0.7261 - val_loss: 1.4386 - val_accuracy: 0.6390
Epoch 41/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0872 - accuracy: 0.7285 - val_loss: 1.6703 - val_accuracy: 0.5911
Epoch 42/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0659 - accuracy: 0.7338 - val_loss: 1.4899 - val_accuracy: 0.6210
Epoch 43/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0818 - accuracy: 0.7216 - val_loss: 1.5128 - val_accuracy: 0.6303
Epoch 44/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0201 - accuracy: 0.7406 - val_loss: 1.4971 - val_accuracy: 0.6230
Epoch 45/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0357 - accuracy: 0.7399 - val_loss: 1.4421 - val_accuracy: 0.6336
Epoch 46/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0339 - accuracy: 0.7301 - val_loss: 1.3479 - val_accuracy: 0.6556
Epoch 47/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9987 - accuracy: 0.7398 - val_loss: 1.3013 - val_accuracy: 0.6636
Epoch 48/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9836 - accuracy: 0.7449 - val_loss: 1.4035 - val_accuracy: 0.6476
Epoch 49/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9584 - accuracy: 0.7555 - val_loss: 1.3838 - val_accuracy: 0.6430
Epoch 50/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9610 - accuracy: 0.7553 - val_loss: 1.3722 - val_accuracy: 0.6602
Epoch 51/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9525 - accuracy: 0.7452 - val_loss: 1.3435 - val_accuracy: 0.6543
Epoch 52/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9339 - accuracy: 0.7568 - val_loss: 1.3550 - val_accuracy: 0.6556
Epoch 53/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9327 - accuracy: 0.7583 - val_loss: 1.2819 - val_accuracy: 0.6715
Epoch 54/200
266/266 [==============================] - 1s 6ms/step - loss: 0.9035 - accuracy: 0.7716 - val_loss: 1.3544 - val_accuracy: 0.6549
Epoch 55/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9076 - accuracy: 0.7668 - val_loss: 1.3768 - val_accuracy: 0.6509
Epoch 56/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8973 - accuracy: 0.7583 - val_loss: 1.3338 - val_accuracy: 0.6496
Epoch 57/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8568 - accuracy: 0.7818 - val_loss: 1.2903 - val_accuracy: 0.6742
Epoch 58/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8799 - accuracy: 0.7727 - val_loss: 1.2677 - val_accuracy: 0.6828
Epoch 59/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8477 - accuracy: 0.7854 - val_loss: 1.2265 - val_accuracy: 0.6895
Epoch 60/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8755 - accuracy: 0.7784 - val_loss: 1.4603 - val_accuracy: 0.6436
Epoch 61/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8354 - accuracy: 0.7841 - val_loss: 1.2910 - val_accuracy: 0.6676
Epoch 62/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8058 - accuracy: 0.7890 - val_loss: 1.2056 - val_accuracy: 0.6915
Epoch 63/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8223 - accuracy: 0.7879 - val_loss: 1.2979 - val_accuracy: 0.6636
Epoch 64/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8048 - accuracy: 0.7876 - val_loss: 1.2175 - val_accuracy: 0.6902
Epoch 65/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8148 - accuracy: 0.7830 - val_loss: 1.2705 - val_accuracy: 0.6775
Epoch 66/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7918 - accuracy: 0.7977 - val_loss: 1.3066 - val_accuracy: 0.6682
Epoch 67/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7862 - accuracy: 0.8019 - val_loss: 1.2243 - val_accuracy: 0.6815
Epoch 68/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7852 - accuracy: 0.7968 - val_loss: 1.2567 - val_accuracy: 0.6689
Epoch 69/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7655 - accuracy: 0.7983 - val_loss: 1.3024 - val_accuracy: 0.6702
Epoch 70/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7342 - accuracy: 0.8135 - val_loss: 1.2437 - val_accuracy: 0.6789
Epoch 71/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7561 - accuracy: 0.8079 - val_loss: 1.3076 - val_accuracy: 0.6682
Epoch 72/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7516 - accuracy: 0.8049 - val_loss: 1.2091 - val_accuracy: 0.6895
Epoch 73/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7366 - accuracy: 0.8052 - val_loss: 1.2664 - val_accuracy: 0.6882
Epoch 74/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7334 - accuracy: 0.8175 - val_loss: 1.2690 - val_accuracy: 0.6809
Epoch 75/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7300 - accuracy: 0.8105 - val_loss: 1.2245 - val_accuracy: 0.6795
Epoch 76/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7093 - accuracy: 0.8140 - val_loss: 1.2478 - val_accuracy: 0.6895
Epoch 77/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7170 - accuracy: 0.8141 - val_loss: 1.3068 - val_accuracy: 0.6709
Epoch 78/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6896 - accuracy: 0.8244 - val_loss: 1.2509 - val_accuracy: 0.6822
Epoch 79/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6899 - accuracy: 0.8325 - val_loss: 1.3025 - val_accuracy: 0.6742
Epoch 80/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7217 - accuracy: 0.8128 - val_loss: 1.2526 - val_accuracy: 0.6822
Epoch 81/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6793 - accuracy: 0.8267 - val_loss: 1.4290 - val_accuracy: 0.6602
Epoch 82/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6871 - accuracy: 0.8228 - val_loss: 1.2722 - val_accuracy: 0.6669
In [ ]:
_, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
accuracies_opt_Nadam["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.208
Accuracy: 69.792%
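The parameter counts in the CNN1 summary above can be reproduced by hand: a Conv2D layer with a k×k kernel, C_in input channels, and C_out filters has k²·C_in·C_out + C_out parameters; a BatchNormalization layer has 4·C (γ and β trainable, moving mean and variance non-trainable); a Dense layer has n_in·n_out + n_out. A quick check against the printed summary:

```python
def conv2d_params(k, c_in, c_out):
    # k*k*c_in weights per filter, plus one bias per filter
    return k * k * c_in * c_out + c_out

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

def batchnorm_params(c):
    # gamma + beta (trainable), moving mean + variance (non-trainable)
    return 4 * c

print(conv2d_params(3, 3, 32))    # 896    (conv2d_13)
print(conv2d_params(3, 32, 64))   # 18496  (conv2d_14)
print(conv2d_params(3, 64, 128))  # 73856  (conv2d_15)
print(dense_params(512, 1024))    # 525312 (dense_20)
print(dense_params(1024, 20))     # 20500  (dense_21)

total = (conv2d_params(3, 3, 32) + conv2d_params(3, 32, 64)
         + conv2d_params(3, 64, 128) + dense_params(512, 1024)
         + dense_params(1024, 20)
         + sum(batchnorm_params(c) for c in (32, 64, 128)))
print(total)  # 639956, matching "Total params: 639,956"
```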
CNN2
In [ ]:
# same procedure for the deeper CNN2: build with Nadam, train with early stopping
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary = True, optimizer = tf.optimizers.Nadam)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_16 (Conv2D)           (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_11 (MaxPooling (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_18 (Dropout)         (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_17 (Conv2D)           (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_12 (MaxPooling (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_19 (Dropout)         (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_18 (Conv2D)           (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_13 (MaxPooling (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_20 (Dropout)         (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_19 (Conv2D)           (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_9 (Batch (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_9 (ReLU)               (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_21 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_5 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_22 (Dense)             (None, 512)               2097664   
_________________________________________________________________
dropout_22 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_23 (Dense)             (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 4s 9ms/step - loss: 6.0009 - accuracy: 0.1140 - val_loss: 6.4798 - val_accuracy: 0.0492
Epoch 2/200
266/266 [==============================] - 2s 8ms/step - loss: 5.2796 - accuracy: 0.2397 - val_loss: 5.5469 - val_accuracy: 0.1556
Epoch 3/200
266/266 [==============================] - 2s 8ms/step - loss: 4.8891 - accuracy: 0.2958 - val_loss: 4.8178 - val_accuracy: 0.2726
Epoch 4/200
266/266 [==============================] - 2s 8ms/step - loss: 4.5278 - accuracy: 0.3340 - val_loss: 4.6121 - val_accuracy: 0.2959
Epoch 5/200
266/266 [==============================] - 2s 8ms/step - loss: 4.1858 - accuracy: 0.3829 - val_loss: 4.3211 - val_accuracy: 0.3265
Epoch 6/200
266/266 [==============================] - 2s 8ms/step - loss: 3.9583 - accuracy: 0.4052 - val_loss: 4.0680 - val_accuracy: 0.3531
Epoch 7/200
266/266 [==============================] - 2s 8ms/step - loss: 3.6458 - accuracy: 0.4505 - val_loss: 4.1784 - val_accuracy: 0.3198
Epoch 8/200
266/266 [==============================] - 2s 8ms/step - loss: 3.4373 - accuracy: 0.4690 - val_loss: 3.6599 - val_accuracy: 0.4089
Epoch 9/200
266/266 [==============================] - 2s 8ms/step - loss: 3.2234 - accuracy: 0.4941 - val_loss: 3.6660 - val_accuracy: 0.3949
Epoch 10/200
266/266 [==============================] - 2s 8ms/step - loss: 3.0636 - accuracy: 0.5056 - val_loss: 3.6183 - val_accuracy: 0.3856
Epoch 11/200
266/266 [==============================] - 2s 8ms/step - loss: 2.8537 - accuracy: 0.5384 - val_loss: 3.2127 - val_accuracy: 0.4315
Epoch 12/200
266/266 [==============================] - 2s 8ms/step - loss: 2.7136 - accuracy: 0.5489 - val_loss: 3.1396 - val_accuracy: 0.4601
Epoch 13/200
266/266 [==============================] - 2s 8ms/step - loss: 2.5978 - accuracy: 0.5589 - val_loss: 2.9790 - val_accuracy: 0.4721
Epoch 14/200
266/266 [==============================] - 2s 8ms/step - loss: 2.4443 - accuracy: 0.5809 - val_loss: 3.0069 - val_accuracy: 0.4508
Epoch 15/200
266/266 [==============================] - 2s 8ms/step - loss: 2.3407 - accuracy: 0.5907 - val_loss: 2.7977 - val_accuracy: 0.4820
Epoch 16/200
266/266 [==============================] - 2s 8ms/step - loss: 2.2239 - accuracy: 0.6044 - val_loss: 3.0752 - val_accuracy: 0.4328
Epoch 17/200
266/266 [==============================] - 2s 8ms/step - loss: 2.1472 - accuracy: 0.6134 - val_loss: 2.5607 - val_accuracy: 0.5133
Epoch 18/200
266/266 [==============================] - 2s 8ms/step - loss: 2.0314 - accuracy: 0.6316 - val_loss: 2.3556 - val_accuracy: 0.5465
Epoch 19/200
266/266 [==============================] - 2s 8ms/step - loss: 1.9606 - accuracy: 0.6390 - val_loss: 2.4043 - val_accuracy: 0.5392
Epoch 20/200
266/266 [==============================] - 2s 8ms/step - loss: 1.8397 - accuracy: 0.6619 - val_loss: 2.3675 - val_accuracy: 0.5306
Epoch 21/200
266/266 [==============================] - 2s 8ms/step - loss: 1.7604 - accuracy: 0.6744 - val_loss: 2.3881 - val_accuracy: 0.5226
Epoch 22/200
266/266 [==============================] - 2s 8ms/step - loss: 1.6981 - accuracy: 0.6766 - val_loss: 2.3288 - val_accuracy: 0.5366
Epoch 23/200
266/266 [==============================] - 2s 8ms/step - loss: 1.6417 - accuracy: 0.6916 - val_loss: 2.1437 - val_accuracy: 0.5725
Epoch 24/200
266/266 [==============================] - 2s 8ms/step - loss: 1.5933 - accuracy: 0.6984 - val_loss: 2.1081 - val_accuracy: 0.5658
Epoch 25/200
266/266 [==============================] - 2s 8ms/step - loss: 1.5259 - accuracy: 0.7045 - val_loss: 1.9719 - val_accuracy: 0.5957
Epoch 26/200
266/266 [==============================] - 2s 8ms/step - loss: 1.4648 - accuracy: 0.7126 - val_loss: 2.1705 - val_accuracy: 0.5532
Epoch 27/200
266/266 [==============================] - 2s 8ms/step - loss: 1.3919 - accuracy: 0.7318 - val_loss: 1.9223 - val_accuracy: 0.6090
Epoch 28/200
266/266 [==============================] - 2s 8ms/step - loss: 1.3543 - accuracy: 0.7409 - val_loss: 1.9500 - val_accuracy: 0.5951
Epoch 29/200
266/266 [==============================] - 2s 8ms/step - loss: 1.3002 - accuracy: 0.7426 - val_loss: 1.9962 - val_accuracy: 0.5984
Epoch 30/200
266/266 [==============================] - 2s 8ms/step - loss: 1.2520 - accuracy: 0.7475 - val_loss: 1.8854 - val_accuracy: 0.6144
Epoch 31/200
266/266 [==============================] - 2s 8ms/step - loss: 1.2067 - accuracy: 0.7580 - val_loss: 1.7771 - val_accuracy: 0.6277
Epoch 32/200
266/266 [==============================] - 2s 8ms/step - loss: 1.1817 - accuracy: 0.7659 - val_loss: 1.9029 - val_accuracy: 0.6051
Epoch 33/200
266/266 [==============================] - 2s 8ms/step - loss: 1.1225 - accuracy: 0.7848 - val_loss: 1.7501 - val_accuracy: 0.6277
Epoch 34/200
266/266 [==============================] - 2s 8ms/step - loss: 1.1032 - accuracy: 0.7917 - val_loss: 1.8947 - val_accuracy: 0.5891
Epoch 35/200
266/266 [==============================] - 2s 8ms/step - loss: 1.0658 - accuracy: 0.7908 - val_loss: 1.6773 - val_accuracy: 0.6516
Epoch 36/200
266/266 [==============================] - 2s 8ms/step - loss: 1.0372 - accuracy: 0.7969 - val_loss: 1.7537 - val_accuracy: 0.6263
Epoch 37/200
266/266 [==============================] - 2s 8ms/step - loss: 1.0140 - accuracy: 0.7968 - val_loss: 1.8384 - val_accuracy: 0.6044
Epoch 38/200
266/266 [==============================] - 2s 8ms/step - loss: 0.9810 - accuracy: 0.8096 - val_loss: 1.7246 - val_accuracy: 0.6316
Epoch 39/200
266/266 [==============================] - 2s 8ms/step - loss: 0.9514 - accuracy: 0.8102 - val_loss: 1.5284 - val_accuracy: 0.6789
Epoch 40/200
266/266 [==============================] - 2s 8ms/step - loss: 0.9220 - accuracy: 0.8178 - val_loss: 1.5965 - val_accuracy: 0.6689
Epoch 41/200
266/266 [==============================] - 2s 8ms/step - loss: 0.8882 - accuracy: 0.8296 - val_loss: 1.6787 - val_accuracy: 0.6403
Epoch 42/200
266/266 [==============================] - 2s 8ms/step - loss: 0.8483 - accuracy: 0.8422 - val_loss: 1.6833 - val_accuracy: 0.6456
Epoch 43/200
266/266 [==============================] - 2s 8ms/step - loss: 0.8436 - accuracy: 0.8411 - val_loss: 1.5824 - val_accuracy: 0.6755
Epoch 44/200
266/266 [==============================] - 2s 8ms/step - loss: 0.8318 - accuracy: 0.8405 - val_loss: 1.6907 - val_accuracy: 0.6330
Epoch 45/200
266/266 [==============================] - 2s 8ms/step - loss: 0.7941 - accuracy: 0.8473 - val_loss: 1.7897 - val_accuracy: 0.6263
Epoch 46/200
266/266 [==============================] - 2s 8ms/step - loss: 0.7937 - accuracy: 0.8505 - val_loss: 1.5604 - val_accuracy: 0.6562
Epoch 47/200
266/266 [==============================] - 2s 8ms/step - loss: 0.7668 - accuracy: 0.8506 - val_loss: 1.6361 - val_accuracy: 0.6536
Epoch 48/200
266/266 [==============================] - 2s 8ms/step - loss: 0.7484 - accuracy: 0.8571 - val_loss: 1.5374 - val_accuracy: 0.6742
Epoch 49/200
266/266 [==============================] - 2s 8ms/step - loss: 0.7379 - accuracy: 0.8642 - val_loss: 1.4696 - val_accuracy: 0.6875
Epoch 50/200
266/266 [==============================] - 2s 8ms/step - loss: 0.7127 - accuracy: 0.8680 - val_loss: 1.5732 - val_accuracy: 0.6576
Epoch 51/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6866 - accuracy: 0.8677 - val_loss: 1.5991 - val_accuracy: 0.6702
Epoch 52/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6546 - accuracy: 0.8866 - val_loss: 1.4911 - val_accuracy: 0.6882
Epoch 53/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6785 - accuracy: 0.8752 - val_loss: 1.5360 - val_accuracy: 0.6729
Epoch 54/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6602 - accuracy: 0.8815 - val_loss: 1.5729 - val_accuracy: 0.6722
Epoch 55/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6438 - accuracy: 0.8832 - val_loss: 1.5409 - val_accuracy: 0.6749
Epoch 56/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6495 - accuracy: 0.8847 - val_loss: 1.4148 - val_accuracy: 0.7094
Epoch 57/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6149 - accuracy: 0.8913 - val_loss: 1.4671 - val_accuracy: 0.6815
Epoch 58/200
266/266 [==============================] - 2s 8ms/step - loss: 0.6264 - accuracy: 0.8843 - val_loss: 1.4897 - val_accuracy: 0.6802
Epoch 59/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5880 - accuracy: 0.8979 - val_loss: 1.6032 - val_accuracy: 0.6629
Epoch 60/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5981 - accuracy: 0.8879 - val_loss: 1.4772 - val_accuracy: 0.6749
Epoch 61/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5737 - accuracy: 0.8999 - val_loss: 1.5735 - val_accuracy: 0.6722
Epoch 62/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5585 - accuracy: 0.8982 - val_loss: 1.7457 - val_accuracy: 0.6383
Epoch 63/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5542 - accuracy: 0.9056 - val_loss: 1.4601 - val_accuracy: 0.6875
Epoch 64/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5467 - accuracy: 0.9000 - val_loss: 1.5648 - val_accuracy: 0.6828
Epoch 65/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5537 - accuracy: 0.9029 - val_loss: 1.4200 - val_accuracy: 0.6968
Epoch 66/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5152 - accuracy: 0.9138 - val_loss: 1.5759 - val_accuracy: 0.6709
Epoch 67/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5228 - accuracy: 0.9107 - val_loss: 1.5266 - val_accuracy: 0.6875
Epoch 68/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5265 - accuracy: 0.9094 - val_loss: 1.7001 - val_accuracy: 0.6662
Epoch 69/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5262 - accuracy: 0.9051 - val_loss: 1.6977 - val_accuracy: 0.6509
Epoch 70/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5149 - accuracy: 0.9114 - val_loss: 1.5512 - val_accuracy: 0.6762
Epoch 71/200
266/266 [==============================] - 2s 8ms/step - loss: 0.5097 - accuracy: 0.9126 - val_loss: 1.4607 - val_accuracy: 0.6908
Epoch 72/200
266/266 [==============================] - 2s 8ms/step - loss: 0.4785 - accuracy: 0.9234 - val_loss: 1.6439 - val_accuracy: 0.6676
Epoch 73/200
266/266 [==============================] - 2s 8ms/step - loss: 0.4762 - accuracy: 0.9231 - val_loss: 1.5773 - val_accuracy: 0.6828
Epoch 74/200
266/266 [==============================] - 2s 8ms/step - loss: 0.4785 - accuracy: 0.9217 - val_loss: 1.5609 - val_accuracy: 0.6742
Epoch 75/200
266/266 [==============================] - 2s 8ms/step - loss: 0.4661 - accuracy: 0.9274 - val_loss: 1.4956 - val_accuracy: 0.6948
Epoch 76/200
266/266 [==============================] - 2s 8ms/step - loss: 0.4722 - accuracy: 0.9215 - val_loss: 1.4886 - val_accuracy: 0.7028
In [ ]:
_, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
accuracies_opt_Nadam["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.394
Accuracy: 70.139%
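Two details of the CNN2 summary are worth verifying. First, unlike CNN1, its Conv2D layers preserve the spatial size (32→32), which corresponds to 'same' padding rather than the 'valid' padding of CNN1 (32→30). Second, the 960 non-trainable parameters come entirely from the BatchNormalization moving statistics, 2 per channel:

```python
def conv_out(size, kernel, padding):
    # output size of a stride-1 convolution
    return size if padding == "same" else size - kernel + 1

print(conv_out(32, 3, "same"))   # 32: matches conv2d_16 in CNN2
print(conv_out(32, 3, "valid"))  # 30: matches conv2d_13 in CNN1

# each BatchNormalization layer keeps 2*C non-trainable values
# (moving mean and moving variance)
channels = (32, 64, 128, 256)
non_trainable = sum(2 * c for c in channels)
print(non_trainable)  # 960, matching "Non-trainable params: 960"
```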

Transfer learning

VGG16
In [ ]:
# fine-tune the VGG16-based transfer-learning model with Nadam and early stopping
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(summary = True, optimizer = tf.optimizers.Nadam)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs = 200, callbacks = [callback])
Model: "sequential_17"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_23 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_11  (None, 512)               0         
_________________________________________________________________
dense_24 (Dense)             (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
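Note that in this transfer-learning setup the entire VGG16 backbone is left trainable ("Non-trainable params: 0"), i.e. full fine-tuning rather than frozen feature extraction; only a small classification head is added on top. The head's size follows directly from the 512-channel VGG16 output:

```python
# VGG16 on 32x32 inputs ends in a (1, 1, 512) feature map;
# GlobalAveragePooling2D reduces it to a 512-vector, and a single
# Dense layer maps it to the 20 classes.
head_params = 512 * 20 + 20
print(head_params)  # 10260, matching dense_24 in the summary

backbone_params = 14_714_688  # "vgg16 (Functional)" row in the summary
print(backbone_params + head_params)  # 14724948, the printed total
```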
Epoch 1/200
266/266 [==============================] - 13s 40ms/step - loss: 2.4992 - accuracy: 0.2594 - val_loss: 1.1534 - val_accuracy: 0.6449
Epoch 2/200
266/266 [==============================] - 10s 39ms/step - loss: 1.1628 - accuracy: 0.6615 - val_loss: 1.1550 - val_accuracy: 0.6769
Epoch 3/200
266/266 [==============================] - 10s 39ms/step - loss: 0.7917 - accuracy: 0.7693 - val_loss: 0.8458 - val_accuracy: 0.7566
Epoch 4/200
266/266 [==============================] - 10s 39ms/step - loss: 0.4995 - accuracy: 0.8565 - val_loss: 0.8292 - val_accuracy: 0.7560
Epoch 5/200
266/266 [==============================] - 10s 39ms/step - loss: 0.3325 - accuracy: 0.9040 - val_loss: 1.0283 - val_accuracy: 0.7533
Epoch 6/200
266/266 [==============================] - 10s 39ms/step - loss: 0.2555 - accuracy: 0.9279 - val_loss: 0.9739 - val_accuracy: 0.7620
Epoch 7/200
266/266 [==============================] - 10s 39ms/step - loss: 0.1787 - accuracy: 0.9476 - val_loss: 1.0663 - val_accuracy: 0.7420
Epoch 8/200
266/266 [==============================] - 10s 39ms/step - loss: 0.1195 - accuracy: 0.9617 - val_loss: 1.1742 - val_accuracy: 0.7394
Epoch 9/200
266/266 [==============================] - 10s 39ms/step - loss: 0.1477 - accuracy: 0.9600 - val_loss: 0.9961 - val_accuracy: 0.7879
Epoch 10/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0691 - accuracy: 0.9801 - val_loss: 1.0753 - val_accuracy: 0.7706
Epoch 11/200
266/266 [==============================] - 10s 39ms/step - loss: 0.1062 - accuracy: 0.9707 - val_loss: 1.1454 - val_accuracy: 0.7666
Epoch 12/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0762 - accuracy: 0.9775 - val_loss: 1.0813 - val_accuracy: 0.7746
Epoch 13/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0533 - accuracy: 0.9843 - val_loss: 1.2419 - val_accuracy: 0.7460
Epoch 14/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0700 - accuracy: 0.9793 - val_loss: 1.1982 - val_accuracy: 0.7580
Epoch 15/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0589 - accuracy: 0.9830 - val_loss: 1.2512 - val_accuracy: 0.7640
Epoch 16/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0728 - accuracy: 0.9797 - val_loss: 1.2197 - val_accuracy: 0.7646
Epoch 17/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0250 - accuracy: 0.9925 - val_loss: 1.3183 - val_accuracy: 0.7600
Epoch 18/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0918 - accuracy: 0.9737 - val_loss: 1.1729 - val_accuracy: 0.7640
Epoch 19/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0516 - accuracy: 0.9870 - val_loss: 1.2972 - val_accuracy: 0.7586
Epoch 20/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0635 - accuracy: 0.9821 - val_loss: 1.3432 - val_accuracy: 0.7493
Epoch 21/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0495 - accuracy: 0.9867 - val_loss: 1.1844 - val_accuracy: 0.7660
Epoch 22/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0672 - accuracy: 0.9799 - val_loss: 1.1230 - val_accuracy: 0.7726
Epoch 23/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0349 - accuracy: 0.9907 - val_loss: 1.4053 - val_accuracy: 0.7586
Epoch 24/200
266/266 [==============================] - 10s 39ms/step - loss: 0.0529 - accuracy: 0.9876 - val_loss: 1.3719 - val_accuracy: 0.7513
In [ ]:
_, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
accuracies_opt_Nadam["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.840
Accuracy: 75.942%
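The Param # column in the summary above can be checked by hand: a Dense layer with n_in inputs and n_out units has (n_in + 1) * n_out parameters (weights plus biases), while Dropout and GlobalAveragePooling2D add none. A small sketch verifying the classification heads of the three transfer models in this section (the base-network counts come from Keras itself):

```python
# Parameter count of a Dense layer: weights (n_in * n_out) plus biases (n_out).
def dense_params(n_in, n_out):
    return (n_in + 1) * n_out

# Heads of the three transfer models above (20 classes each).
print(dense_params(512, 20))   # VGG16 head       -> 10260
print(dense_params(1280, 20))  # MobileNetV2 head -> 25620
print(dense_params(1024, 20))  # DenseNet121 head -> 20500

# VGG16 total params: base 14,714,688 + head 10,260 = 14,724,948
print(14714688 + dense_params(512, 20))
```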
MobileNet
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True, optimizer = tf.optimizers.Nadam)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks=[callback])
Model: "sequential_18"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_24 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_12  (None, 1280)              0         
_________________________________________________________________
dense_25 (Dense)             (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 73s 235ms/step - loss: 1.7689 - accuracy: 0.4998 - val_loss: 2.2874 - val_accuracy: 0.3943
Epoch 2/200
266/266 [==============================] - 62s 233ms/step - loss: 0.3508 - accuracy: 0.8965 - val_loss: 2.0463 - val_accuracy: 0.4548
Epoch 3/200
266/266 [==============================] - 62s 233ms/step - loss: 0.1492 - accuracy: 0.9595 - val_loss: 2.1907 - val_accuracy: 0.4568
Epoch 4/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0798 - accuracy: 0.9788 - val_loss: 2.6748 - val_accuracy: 0.3830
Epoch 5/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0426 - accuracy: 0.9914 - val_loss: 2.8211 - val_accuracy: 0.3710
Epoch 6/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0296 - accuracy: 0.9943 - val_loss: 2.5179 - val_accuracy: 0.4535
Epoch 7/200
266/266 [==============================] - 61s 231ms/step - loss: 0.0317 - accuracy: 0.9921 - val_loss: 2.3216 - val_accuracy: 0.4887
Epoch 8/200
266/266 [==============================] - 62s 232ms/step - loss: 0.0231 - accuracy: 0.9943 - val_loss: 1.4094 - val_accuracy: 0.6948
Epoch 9/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0219 - accuracy: 0.9947 - val_loss: 1.4736 - val_accuracy: 0.6629
Epoch 10/200
266/266 [==============================] - 62s 232ms/step - loss: 0.0163 - accuracy: 0.9967 - val_loss: 1.0776 - val_accuracy: 0.7626
Epoch 11/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0277 - accuracy: 0.9910 - val_loss: 0.8345 - val_accuracy: 0.8039
Epoch 12/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0384 - accuracy: 0.9899 - val_loss: 0.9899 - val_accuracy: 0.7633
Epoch 13/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0239 - accuracy: 0.9926 - val_loss: 1.4764 - val_accuracy: 0.6862
Epoch 14/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0275 - accuracy: 0.9915 - val_loss: 0.8649 - val_accuracy: 0.8125
Epoch 15/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0288 - accuracy: 0.9912 - val_loss: 1.0999 - val_accuracy: 0.7866
Epoch 16/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0227 - accuracy: 0.9936 - val_loss: 0.6650 - val_accuracy: 0.8484
Epoch 17/200
266/266 [==============================] - 62s 232ms/step - loss: 0.0145 - accuracy: 0.9953 - val_loss: 0.7902 - val_accuracy: 0.8364
Epoch 18/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0188 - accuracy: 0.9933 - val_loss: 0.5882 - val_accuracy: 0.8551
Epoch 19/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0249 - accuracy: 0.9927 - val_loss: 0.5879 - val_accuracy: 0.8604
Epoch 20/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0215 - accuracy: 0.9948 - val_loss: 0.6219 - val_accuracy: 0.8630
Epoch 21/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0119 - accuracy: 0.9965 - val_loss: 0.8136 - val_accuracy: 0.8424
Epoch 22/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0166 - accuracy: 0.9948 - val_loss: 0.6961 - val_accuracy: 0.8398
Epoch 23/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0220 - accuracy: 0.9914 - val_loss: 0.8316 - val_accuracy: 0.8331
Epoch 24/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0238 - accuracy: 0.9928 - val_loss: 0.5831 - val_accuracy: 0.8684
Epoch 25/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0194 - accuracy: 0.9925 - val_loss: 0.8279 - val_accuracy: 0.8424
Epoch 26/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0259 - accuracy: 0.9936 - val_loss: 0.7122 - val_accuracy: 0.8564
Epoch 27/200
266/266 [==============================] - 62s 235ms/step - loss: 0.0209 - accuracy: 0.9933 - val_loss: 0.8142 - val_accuracy: 0.8424
Epoch 28/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0151 - accuracy: 0.9950 - val_loss: 0.7669 - val_accuracy: 0.8511
Epoch 29/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0115 - accuracy: 0.9970 - val_loss: 0.8249 - val_accuracy: 0.8371
Epoch 30/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0111 - accuracy: 0.9974 - val_loss: 0.7745 - val_accuracy: 0.8630
Epoch 31/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0099 - accuracy: 0.9964 - val_loss: 0.6727 - val_accuracy: 0.8737
Epoch 32/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0172 - accuracy: 0.9941 - val_loss: 1.0071 - val_accuracy: 0.8291
Epoch 33/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0235 - accuracy: 0.9941 - val_loss: 0.9708 - val_accuracy: 0.8112
Epoch 34/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0201 - accuracy: 0.9933 - val_loss: 0.7422 - val_accuracy: 0.8531
Epoch 35/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0140 - accuracy: 0.9947 - val_loss: 0.7290 - val_accuracy: 0.8617
Epoch 36/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0116 - accuracy: 0.9971 - val_loss: 0.7593 - val_accuracy: 0.8398
Epoch 37/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0170 - accuracy: 0.9944 - val_loss: 0.8088 - val_accuracy: 0.8471
Epoch 38/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0102 - accuracy: 0.9963 - val_loss: 0.7172 - val_accuracy: 0.8597
Epoch 39/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0144 - accuracy: 0.9950 - val_loss: 0.7553 - val_accuracy: 0.8404
Epoch 40/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0186 - accuracy: 0.9946 - val_loss: 0.6584 - val_accuracy: 0.8684
Epoch 41/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0073 - accuracy: 0.9975 - val_loss: 0.5908 - val_accuracy: 0.8750
Epoch 42/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0118 - accuracy: 0.9963 - val_loss: 0.7816 - val_accuracy: 0.8597
Epoch 43/200
266/266 [==============================] - 62s 234ms/step - loss: 0.0148 - accuracy: 0.9967 - val_loss: 0.6241 - val_accuracy: 0.8830
Epoch 44/200
266/266 [==============================] - 62s 233ms/step - loss: 0.0116 - accuracy: 0.9972 - val_loss: 0.6768 - val_accuracy: 0.8790
In [ ]:
_, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
accuracies_opt_Nadam["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.573
Accuracy: 86.210%
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True, optimizer = tf.optimizers.Nadam)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_19"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_25 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_13  (None, 1024)              0         
_________________________________________________________________
dense_26 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 44s 70ms/step - loss: 3.6886 - accuracy: 0.1368 - val_loss: 1.9870 - val_accuracy: 0.4894
Epoch 2/200
266/266 [==============================] - 17s 62ms/step - loss: 1.8614 - accuracy: 0.4662 - val_loss: 1.2792 - val_accuracy: 0.6443
Epoch 3/200
266/266 [==============================] - 17s 62ms/step - loss: 1.2915 - accuracy: 0.6251 - val_loss: 1.0340 - val_accuracy: 0.7001
Epoch 4/200
266/266 [==============================] - 16s 62ms/step - loss: 1.0103 - accuracy: 0.6968 - val_loss: 0.9451 - val_accuracy: 0.7214
Epoch 5/200
266/266 [==============================] - 16s 62ms/step - loss: 0.7685 - accuracy: 0.7640 - val_loss: 0.8984 - val_accuracy: 0.7420
Epoch 6/200
266/266 [==============================] - 17s 62ms/step - loss: 0.6255 - accuracy: 0.8103 - val_loss: 0.8568 - val_accuracy: 0.7606
Epoch 7/200
266/266 [==============================] - 17s 63ms/step - loss: 0.5334 - accuracy: 0.8294 - val_loss: 0.8694 - val_accuracy: 0.7620
Epoch 8/200
266/266 [==============================] - 16s 62ms/step - loss: 0.4056 - accuracy: 0.8724 - val_loss: 0.8904 - val_accuracy: 0.7593
Epoch 9/200
266/266 [==============================] - 17s 62ms/step - loss: 0.3155 - accuracy: 0.9041 - val_loss: 0.8563 - val_accuracy: 0.7633
Epoch 10/200
266/266 [==============================] - 16s 61ms/step - loss: 0.2433 - accuracy: 0.9260 - val_loss: 0.8958 - val_accuracy: 0.7646
Epoch 11/200
266/266 [==============================] - 17s 62ms/step - loss: 0.2232 - accuracy: 0.9305 - val_loss: 0.9351 - val_accuracy: 0.7660
Epoch 12/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1993 - accuracy: 0.9429 - val_loss: 0.9684 - val_accuracy: 0.7699
Epoch 13/200
266/266 [==============================] - 16s 62ms/step - loss: 0.1608 - accuracy: 0.9538 - val_loss: 0.9978 - val_accuracy: 0.7779
Epoch 14/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1486 - accuracy: 0.9523 - val_loss: 1.0096 - val_accuracy: 0.7693
Epoch 15/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1600 - accuracy: 0.9473 - val_loss: 0.9524 - val_accuracy: 0.7812
Epoch 16/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1260 - accuracy: 0.9630 - val_loss: 0.9850 - val_accuracy: 0.7739
Epoch 17/200
266/266 [==============================] - 17s 63ms/step - loss: 0.0987 - accuracy: 0.9666 - val_loss: 0.9718 - val_accuracy: 0.7753
Epoch 18/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1031 - accuracy: 0.9673 - val_loss: 1.0028 - val_accuracy: 0.7693
Epoch 19/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1112 - accuracy: 0.9649 - val_loss: 1.0077 - val_accuracy: 0.7759
Epoch 20/200
266/266 [==============================] - 16s 62ms/step - loss: 0.1060 - accuracy: 0.9654 - val_loss: 1.0425 - val_accuracy: 0.7793
Epoch 21/200
266/266 [==============================] - 17s 62ms/step - loss: 0.1087 - accuracy: 0.9684 - val_loss: 0.9948 - val_accuracy: 0.7832
Epoch 22/200
266/266 [==============================] - 17s 63ms/step - loss: 0.0874 - accuracy: 0.9738 - val_loss: 1.0264 - val_accuracy: 0.7872
Epoch 23/200
266/266 [==============================] - 16s 62ms/step - loss: 0.0971 - accuracy: 0.9691 - val_loss: 1.1035 - val_accuracy: 0.7739
Epoch 24/200
266/266 [==============================] - 17s 62ms/step - loss: 0.0823 - accuracy: 0.9769 - val_loss: 0.9885 - val_accuracy: 0.7859
Epoch 25/200
266/266 [==============================] - 17s 62ms/step - loss: 0.0642 - accuracy: 0.9789 - val_loss: 1.0716 - val_accuracy: 0.7626
Epoch 26/200
266/266 [==============================] - 17s 63ms/step - loss: 0.0739 - accuracy: 0.9802 - val_loss: 1.0106 - val_accuracy: 0.7839
Epoch 27/200
266/266 [==============================] - 17s 63ms/step - loss: 0.0774 - accuracy: 0.9779 - val_loss: 1.0571 - val_accuracy: 0.7666
Epoch 28/200
266/266 [==============================] - 17s 63ms/step - loss: 0.0715 - accuracy: 0.9752 - val_loss: 1.0117 - val_accuracy: 0.7879
Epoch 29/200
266/266 [==============================] - 17s 63ms/step - loss: 0.0596 - accuracy: 0.9811 - val_loss: 0.9439 - val_accuracy: 0.7859
In [ ]:
_, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
accuracies_opt_Nadam["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.883
Accuracy: 75.347%
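All three Nadam runs above stop well short of the 200-epoch budget (at epochs 24, 44 and 29), which is consistent with `callback` being an early-stopping callback monitoring `val_loss` with a patience of around 20; its exact definition does not appear in this section, so both the monitored quantity and the patience value are inferences. A minimal pure-Python sketch of the patience logic:

```python
# Minimal early-stopping tracker: halt when the monitored value has not
# improved for `patience` consecutive epochs. The patience of 20 is an
# assumption inferred from the run lengths above, not the notebook's
# actual callback definition.
def early_stop_epoch(val_losses, patience=20):
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch  # training halts at this epoch
    return len(val_losses)  # budget exhausted without triggering

# Example: best val_loss at epoch 4, no improvement afterwards -> stop at 24,
# matching the VGG16 run above.
losses = [1.15, 1.16, 0.85, 0.83] + [0.9] * 30
print(early_stop_epoch(losses))  # 24
```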

SGD

"From scratch" networks

In [ ]:
accuracies_opt_SGD = {}
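The Simple CNN run below converges far more slowly under SGD than under the adaptive optimizers: vanilla SGD applies one fixed global learning rate to every weight (with no momentum unless it is specified), whereas Nadam adapts the step size per parameter. A sketch of the plain update rule on a 1-D quadratic, illustrating how slowly a small fixed learning rate closes in on the minimum:

```python
# Vanilla SGD update: w <- w - lr * gradient. One global step size,
# no momentum, no per-parameter adaptation.
def sgd_minimize(grad, w0, lr=0.01, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Minimize f(w) = (w - 3)^2, so grad(w) = 2 * (w - 3).
# With lr=0.01 the iterate is still noticeably short of the optimum
# after 100 steps; a larger lr reaches it almost exactly.
print(sgd_minimize(lambda w: 2.0 * (w - 3.0), w0=0.0, lr=0.01))  # ~2.6
print(sgd_minimize(lambda w: 2.0 * (w - 3.0), w0=0.0, lr=0.1))   # ~3.0
```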
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True, optimizer = tf.optimizers.SGD)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_20"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_20 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_10 (Batc (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_10 (ReLU)              (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_14 (MaxPooling (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_26 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_21 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_11 (Batc (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_11 (ReLU)              (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_15 (MaxPooling (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_27 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_22 (Conv2D)           (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_12 (Batc (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_12 (ReLU)              (None, 4, 4, 64)          0         
_________________________________________________________________
flatten_6 (Flatten)          (None, 1024)              0         
_________________________________________________________________
dropout_28 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_27 (Dense)             (None, 64)                65600     
_________________________________________________________________
dense_28 (Dense)             (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 2s 5ms/step - loss: 4.5542 - accuracy: 0.0585 - val_loss: 4.2280 - val_accuracy: 0.0472
Epoch 2/200
266/266 [==============================] - 1s 4ms/step - loss: 4.4910 - accuracy: 0.0476 - val_loss: 4.1783 - val_accuracy: 0.0532
Epoch 3/200
266/266 [==============================] - 1s 4ms/step - loss: 4.4127 - accuracy: 0.0548 - val_loss: 4.1440 - val_accuracy: 0.0751
Epoch 4/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3706 - accuracy: 0.0631 - val_loss: 4.1114 - val_accuracy: 0.0918
Epoch 5/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3270 - accuracy: 0.0649 - val_loss: 4.0812 - val_accuracy: 0.1004
Epoch 6/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2686 - accuracy: 0.0749 - val_loss: 4.0546 - val_accuracy: 0.1104
Epoch 7/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2739 - accuracy: 0.0766 - val_loss: 4.0329 - val_accuracy: 0.1237
Epoch 8/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2304 - accuracy: 0.0793 - val_loss: 4.0141 - val_accuracy: 0.1283
Epoch 9/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2098 - accuracy: 0.0861 - val_loss: 3.9960 - val_accuracy: 0.1396
Epoch 10/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1926 - accuracy: 0.0955 - val_loss: 3.9777 - val_accuracy: 0.1463
Epoch 11/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1570 - accuracy: 0.0951 - val_loss: 3.9636 - val_accuracy: 0.1496
Epoch 12/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1319 - accuracy: 0.1001 - val_loss: 3.9497 - val_accuracy: 0.1543
Epoch 13/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1283 - accuracy: 0.0959 - val_loss: 3.9377 - val_accuracy: 0.1602
Epoch 14/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0895 - accuracy: 0.1068 - val_loss: 3.9231 - val_accuracy: 0.1676
Epoch 15/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0753 - accuracy: 0.1067 - val_loss: 3.9125 - val_accuracy: 0.1682
Epoch 16/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0870 - accuracy: 0.1040 - val_loss: 3.9014 - val_accuracy: 0.1682
Epoch 17/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0575 - accuracy: 0.1092 - val_loss: 3.8915 - val_accuracy: 0.1749
Epoch 18/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0591 - accuracy: 0.1104 - val_loss: 3.8804 - val_accuracy: 0.1715
Epoch 19/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0614 - accuracy: 0.1095 - val_loss: 3.8722 - val_accuracy: 0.1755
Epoch 20/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0224 - accuracy: 0.1257 - val_loss: 3.8632 - val_accuracy: 0.1775
Epoch 21/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0190 - accuracy: 0.1246 - val_loss: 3.8543 - val_accuracy: 0.1795
Epoch 22/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9970 - accuracy: 0.1215 - val_loss: 3.8478 - val_accuracy: 0.1828
Epoch 23/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9956 - accuracy: 0.1191 - val_loss: 3.8375 - val_accuracy: 0.1888
Epoch 24/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9780 - accuracy: 0.1277 - val_loss: 3.8316 - val_accuracy: 0.1908
Epoch 25/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9860 - accuracy: 0.1259 - val_loss: 3.8224 - val_accuracy: 0.1941
Epoch 26/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9734 - accuracy: 0.1304 - val_loss: 3.8149 - val_accuracy: 0.1968
Epoch 27/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9768 - accuracy: 0.1250 - val_loss: 3.8097 - val_accuracy: 0.1968
Epoch 28/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9381 - accuracy: 0.1290 - val_loss: 3.8008 - val_accuracy: 0.1988
Epoch 29/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9279 - accuracy: 0.1360 - val_loss: 3.7949 - val_accuracy: 0.2008
Epoch 30/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9464 - accuracy: 0.1359 - val_loss: 3.7870 - val_accuracy: 0.2048
Epoch 31/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9304 - accuracy: 0.1409 - val_loss: 3.7836 - val_accuracy: 0.2048
Epoch 32/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9089 - accuracy: 0.1460 - val_loss: 3.7752 - val_accuracy: 0.2061
Epoch 33/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9035 - accuracy: 0.1526 - val_loss: 3.7693 - val_accuracy: 0.2081
Epoch 34/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9033 - accuracy: 0.1431 - val_loss: 3.7621 - val_accuracy: 0.2114
Epoch 35/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8892 - accuracy: 0.1531 - val_loss: 3.7563 - val_accuracy: 0.2161
Epoch 36/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8987 - accuracy: 0.1455 - val_loss: 3.7517 - val_accuracy: 0.2094
Epoch 37/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8854 - accuracy: 0.1536 - val_loss: 3.7456 - val_accuracy: 0.2114
Epoch 38/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8543 - accuracy: 0.1612 - val_loss: 3.7385 - val_accuracy: 0.2134
Epoch 39/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8446 - accuracy: 0.1632 - val_loss: 3.7330 - val_accuracy: 0.2148
Epoch 40/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8562 - accuracy: 0.1561 - val_loss: 3.7242 - val_accuracy: 0.2207
Epoch 41/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8753 - accuracy: 0.1508 - val_loss: 3.7220 - val_accuracy: 0.2174
Epoch 42/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8520 - accuracy: 0.1659 - val_loss: 3.7178 - val_accuracy: 0.2201
Epoch 43/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8258 - accuracy: 0.1640 - val_loss: 3.7099 - val_accuracy: 0.2234
Epoch 44/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8243 - accuracy: 0.1606 - val_loss: 3.7037 - val_accuracy: 0.2227
Epoch 45/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7912 - accuracy: 0.1742 - val_loss: 3.6997 - val_accuracy: 0.2194
Epoch 46/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8216 - accuracy: 0.1610 - val_loss: 3.6940 - val_accuracy: 0.2207
Epoch 47/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8058 - accuracy: 0.1677 - val_loss: 3.6869 - val_accuracy: 0.2201
Epoch 48/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7996 - accuracy: 0.1740 - val_loss: 3.6832 - val_accuracy: 0.2201
Epoch 49/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8163 - accuracy: 0.1656 - val_loss: 3.6763 - val_accuracy: 0.2174
Epoch 50/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7989 - accuracy: 0.1720 - val_loss: 3.6688 - val_accuracy: 0.2214
Epoch 51/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7826 - accuracy: 0.1753 - val_loss: 3.6662 - val_accuracy: 0.2221
Epoch 52/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7804 - accuracy: 0.1759 - val_loss: 3.6608 - val_accuracy: 0.2221
Epoch 53/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7778 - accuracy: 0.1710 - val_loss: 3.6556 - val_accuracy: 0.2274
Epoch 54/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7622 - accuracy: 0.1777 - val_loss: 3.6507 - val_accuracy: 0.2221
Epoch 55/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7663 - accuracy: 0.1736 - val_loss: 3.6418 - val_accuracy: 0.2227
Epoch 56/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7454 - accuracy: 0.1788 - val_loss: 3.6380 - val_accuracy: 0.2274
Epoch 57/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7652 - accuracy: 0.1791 - val_loss: 3.6375 - val_accuracy: 0.2254
Epoch 58/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7372 - accuracy: 0.1917 - val_loss: 3.6299 - val_accuracy: 0.2247
Epoch 59/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7363 - accuracy: 0.1855 - val_loss: 3.6239 - val_accuracy: 0.2274
Epoch 60/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7323 - accuracy: 0.1823 - val_loss: 3.6176 - val_accuracy: 0.2287
Epoch 61/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7212 - accuracy: 0.1935 - val_loss: 3.6128 - val_accuracy: 0.2281
Epoch 62/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7107 - accuracy: 0.1958 - val_loss: 3.6109 - val_accuracy: 0.2327
Epoch 63/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7334 - accuracy: 0.1843 - val_loss: 3.6078 - val_accuracy: 0.2294
Epoch 64/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7088 - accuracy: 0.1936 - val_loss: 3.5987 - val_accuracy: 0.2320
Epoch 65/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7038 - accuracy: 0.1897 - val_loss: 3.5991 - val_accuracy: 0.2327
Epoch 66/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7131 - accuracy: 0.1876 - val_loss: 3.5926 - val_accuracy: 0.2334
Epoch 67/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6863 - accuracy: 0.1987 - val_loss: 3.5874 - val_accuracy: 0.2414
Epoch 68/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6963 - accuracy: 0.1973 - val_loss: 3.5806 - val_accuracy: 0.2420
Epoch 69/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6706 - accuracy: 0.1928 - val_loss: 3.5785 - val_accuracy: 0.2374
Epoch 70/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6834 - accuracy: 0.1969 - val_loss: 3.5752 - val_accuracy: 0.2374
Epoch 71/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6764 - accuracy: 0.1905 - val_loss: 3.5704 - val_accuracy: 0.2387
Epoch 72/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6642 - accuracy: 0.2079 - val_loss: 3.5637 - val_accuracy: 0.2407
Epoch 73/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6714 - accuracy: 0.2054 - val_loss: 3.5560 - val_accuracy: 0.2447
Epoch 74/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6636 - accuracy: 0.1971 - val_loss: 3.5494 - val_accuracy: 0.2460
Epoch 75/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6527 - accuracy: 0.2056 - val_loss: 3.5494 - val_accuracy: 0.2427
Epoch 76/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6597 - accuracy: 0.2033 - val_loss: 3.5428 - val_accuracy: 0.2480
Epoch 77/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6325 - accuracy: 0.2111 - val_loss: 3.5370 - val_accuracy: 0.2500
Epoch 78/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6182 - accuracy: 0.2175 - val_loss: 3.5365 - val_accuracy: 0.2500
Epoch 79/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6333 - accuracy: 0.2150 - val_loss: 3.5310 - val_accuracy: 0.2487
Epoch 80/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6413 - accuracy: 0.2084 - val_loss: 3.5257 - val_accuracy: 0.2527
Epoch 81/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6528 - accuracy: 0.2086 - val_loss: 3.5224 - val_accuracy: 0.2487
Epoch 82/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6284 - accuracy: 0.2148 - val_loss: 3.5162 - val_accuracy: 0.2493
Epoch 83/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6105 - accuracy: 0.2153 - val_loss: 3.5165 - val_accuracy: 0.2487
Epoch 84/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5963 - accuracy: 0.2158 - val_loss: 3.5092 - val_accuracy: 0.2500
Epoch 85/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6227 - accuracy: 0.2072 - val_loss: 3.5015 - val_accuracy: 0.2553
Epoch 86/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5973 - accuracy: 0.2136 - val_loss: 3.5011 - val_accuracy: 0.2520
Epoch 87/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5951 - accuracy: 0.2175 - val_loss: 3.4906 - val_accuracy: 0.2580
Epoch 88/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5935 - accuracy: 0.2143 - val_loss: 3.4973 - val_accuracy: 0.2500
Epoch 89/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5710 - accuracy: 0.2272 - val_loss: 3.4808 - val_accuracy: 0.2620
Epoch 90/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5900 - accuracy: 0.2173 - val_loss: 3.4855 - val_accuracy: 0.2573
Epoch 91/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5596 - accuracy: 0.2325 - val_loss: 3.4799 - val_accuracy: 0.2606
Epoch 92/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5758 - accuracy: 0.2198 - val_loss: 3.4749 - val_accuracy: 0.2613
Epoch 93/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5619 - accuracy: 0.2280 - val_loss: 3.4690 - val_accuracy: 0.2633
Epoch 94/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5557 - accuracy: 0.2340 - val_loss: 3.4654 - val_accuracy: 0.2626
Epoch 95/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5554 - accuracy: 0.2304 - val_loss: 3.4616 - val_accuracy: 0.2620
Epoch 96/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5356 - accuracy: 0.2379 - val_loss: 3.4619 - val_accuracy: 0.2660
Epoch 97/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5540 - accuracy: 0.2254 - val_loss: 3.4513 - val_accuracy: 0.2660
Epoch 98/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5301 - accuracy: 0.2380 - val_loss: 3.4469 - val_accuracy: 0.2706
Epoch 99/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5371 - accuracy: 0.2354 - val_loss: 3.4414 - val_accuracy: 0.2699
Epoch 100/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5394 - accuracy: 0.2357 - val_loss: 3.4412 - val_accuracy: 0.2746
Epoch 101/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5320 - accuracy: 0.2281 - val_loss: 3.4353 - val_accuracy: 0.2726
Epoch 102/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5095 - accuracy: 0.2399 - val_loss: 3.4259 - val_accuracy: 0.2719
Epoch 103/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5402 - accuracy: 0.2344 - val_loss: 3.4322 - val_accuracy: 0.2726
Epoch 104/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5204 - accuracy: 0.2323 - val_loss: 3.4290 - val_accuracy: 0.2753
Epoch 105/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5108 - accuracy: 0.2381 - val_loss: 3.4157 - val_accuracy: 0.2793
Epoch 106/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4987 - accuracy: 0.2332 - val_loss: 3.4146 - val_accuracy: 0.2759
Epoch 107/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4958 - accuracy: 0.2471 - val_loss: 3.4088 - val_accuracy: 0.2786
Epoch 108/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4880 - accuracy: 0.2493 - val_loss: 3.4038 - val_accuracy: 0.2799
Epoch 109/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4893 - accuracy: 0.2448 - val_loss: 3.4021 - val_accuracy: 0.2806
Epoch 110/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5145 - accuracy: 0.2339 - val_loss: 3.3993 - val_accuracy: 0.2799
Epoch 111/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4840 - accuracy: 0.2493 - val_loss: 3.3956 - val_accuracy: 0.2832
Epoch 112/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4698 - accuracy: 0.2548 - val_loss: 3.3867 - val_accuracy: 0.2819
Epoch 113/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4931 - accuracy: 0.2488 - val_loss: 3.3897 - val_accuracy: 0.2839
Epoch 114/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4600 - accuracy: 0.2457 - val_loss: 3.3798 - val_accuracy: 0.2872
Epoch 115/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4707 - accuracy: 0.2492 - val_loss: 3.3749 - val_accuracy: 0.2859
Epoch 116/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4811 - accuracy: 0.2400 - val_loss: 3.3762 - val_accuracy: 0.2852
Epoch 117/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4764 - accuracy: 0.2478 - val_loss: 3.3699 - val_accuracy: 0.2886
Epoch 118/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4700 - accuracy: 0.2442 - val_loss: 3.3600 - val_accuracy: 0.2906
Epoch 119/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4331 - accuracy: 0.2561 - val_loss: 3.3551 - val_accuracy: 0.2932
Epoch 120/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4515 - accuracy: 0.2537 - val_loss: 3.3535 - val_accuracy: 0.2906
Epoch 121/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4443 - accuracy: 0.2517 - val_loss: 3.3541 - val_accuracy: 0.2906
Epoch 122/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4419 - accuracy: 0.2607 - val_loss: 3.3500 - val_accuracy: 0.2952
Epoch 123/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4598 - accuracy: 0.2467 - val_loss: 3.3443 - val_accuracy: 0.2919
Epoch 124/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4427 - accuracy: 0.2458 - val_loss: 3.3420 - val_accuracy: 0.2932
Epoch 125/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3999 - accuracy: 0.2646 - val_loss: 3.3326 - val_accuracy: 0.2952
Epoch 126/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3963 - accuracy: 0.2681 - val_loss: 3.3282 - val_accuracy: 0.2952
Epoch 127/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4132 - accuracy: 0.2554 - val_loss: 3.3278 - val_accuracy: 0.2985
Epoch 128/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4084 - accuracy: 0.2608 - val_loss: 3.3204 - val_accuracy: 0.2992
Epoch 129/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4043 - accuracy: 0.2596 - val_loss: 3.3208 - val_accuracy: 0.2959
Epoch 130/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4121 - accuracy: 0.2623 - val_loss: 3.3216 - val_accuracy: 0.2965
Epoch 131/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4068 - accuracy: 0.2646 - val_loss: 3.3099 - val_accuracy: 0.3012
Epoch 132/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4048 - accuracy: 0.2636 - val_loss: 3.3005 - val_accuracy: 0.3012
Epoch 133/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3860 - accuracy: 0.2578 - val_loss: 3.3010 - val_accuracy: 0.3025
Epoch 134/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3847 - accuracy: 0.2644 - val_loss: 3.2961 - val_accuracy: 0.3052
Epoch 135/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4031 - accuracy: 0.2520 - val_loss: 3.2928 - val_accuracy: 0.3065
Epoch 136/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3978 - accuracy: 0.2654 - val_loss: 3.2915 - val_accuracy: 0.3125
Epoch 137/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3795 - accuracy: 0.2725 - val_loss: 3.2814 - val_accuracy: 0.3152
Epoch 138/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3625 - accuracy: 0.2722 - val_loss: 3.2764 - val_accuracy: 0.3138
Epoch 139/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3849 - accuracy: 0.2689 - val_loss: 3.2749 - val_accuracy: 0.3152
Epoch 140/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3623 - accuracy: 0.2708 - val_loss: 3.2691 - val_accuracy: 0.3158
Epoch 141/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3547 - accuracy: 0.2752 - val_loss: 3.2664 - val_accuracy: 0.3198
Epoch 142/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3565 - accuracy: 0.2734 - val_loss: 3.2565 - val_accuracy: 0.3198
Epoch 143/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3614 - accuracy: 0.2716 - val_loss: 3.2600 - val_accuracy: 0.3245
Epoch 144/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3335 - accuracy: 0.2822 - val_loss: 3.2538 - val_accuracy: 0.3198
Epoch 145/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3387 - accuracy: 0.2902 - val_loss: 3.2472 - val_accuracy: 0.3245
Epoch 146/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3473 - accuracy: 0.2673 - val_loss: 3.2403 - val_accuracy: 0.3231
Epoch 147/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3523 - accuracy: 0.2780 - val_loss: 3.2334 - val_accuracy: 0.3238
Epoch 148/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3422 - accuracy: 0.2878 - val_loss: 3.2358 - val_accuracy: 0.3231
Epoch 149/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3114 - accuracy: 0.2812 - val_loss: 3.2331 - val_accuracy: 0.3225
Epoch 150/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3113 - accuracy: 0.2862 - val_loss: 3.2290 - val_accuracy: 0.3251
Epoch 151/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3082 - accuracy: 0.2879 - val_loss: 3.2219 - val_accuracy: 0.3258
Epoch 152/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2997 - accuracy: 0.2880 - val_loss: 3.2178 - val_accuracy: 0.3271
Epoch 153/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3155 - accuracy: 0.2849 - val_loss: 3.2202 - val_accuracy: 0.3225
Epoch 154/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2992 - accuracy: 0.2876 - val_loss: 3.2114 - val_accuracy: 0.3238
Epoch 155/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3050 - accuracy: 0.2826 - val_loss: 3.2055 - val_accuracy: 0.3298
Epoch 156/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2910 - accuracy: 0.2778 - val_loss: 3.1999 - val_accuracy: 0.3338
Epoch 157/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2735 - accuracy: 0.2998 - val_loss: 3.1972 - val_accuracy: 0.3324
Epoch 158/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3045 - accuracy: 0.2857 - val_loss: 3.1933 - val_accuracy: 0.3351
Epoch 159/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2937 - accuracy: 0.2885 - val_loss: 3.1860 - val_accuracy: 0.3324
Epoch 160/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3065 - accuracy: 0.2807 - val_loss: 3.1876 - val_accuracy: 0.3364
Epoch 161/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2822 - accuracy: 0.2830 - val_loss: 3.1800 - val_accuracy: 0.3331
Epoch 162/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2855 - accuracy: 0.2900 - val_loss: 3.1829 - val_accuracy: 0.3291
Epoch 163/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2576 - accuracy: 0.2929 - val_loss: 3.1757 - val_accuracy: 0.3378
Epoch 164/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2778 - accuracy: 0.2900 - val_loss: 3.1730 - val_accuracy: 0.3404
Epoch 165/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2527 - accuracy: 0.3026 - val_loss: 3.1690 - val_accuracy: 0.3311
Epoch 166/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2611 - accuracy: 0.2874 - val_loss: 3.1628 - val_accuracy: 0.3351
Epoch 167/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2566 - accuracy: 0.3017 - val_loss: 3.1610 - val_accuracy: 0.3324
Epoch 168/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2629 - accuracy: 0.2991 - val_loss: 3.1597 - val_accuracy: 0.3424
Epoch 169/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2448 - accuracy: 0.3007 - val_loss: 3.1470 - val_accuracy: 0.3438
Epoch 170/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2611 - accuracy: 0.2864 - val_loss: 3.1511 - val_accuracy: 0.3424
Epoch 171/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2304 - accuracy: 0.2978 - val_loss: 3.1429 - val_accuracy: 0.3457
Epoch 172/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2501 - accuracy: 0.2890 - val_loss: 3.1377 - val_accuracy: 0.3497
Epoch 173/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2301 - accuracy: 0.3020 - val_loss: 3.1362 - val_accuracy: 0.3418
Epoch 174/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2294 - accuracy: 0.3088 - val_loss: 3.1388 - val_accuracy: 0.3431
Epoch 175/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2320 - accuracy: 0.3051 - val_loss: 3.1292 - val_accuracy: 0.3471
Epoch 176/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2031 - accuracy: 0.3063 - val_loss: 3.1227 - val_accuracy: 0.3457
Epoch 177/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2269 - accuracy: 0.2963 - val_loss: 3.1177 - val_accuracy: 0.3511
Epoch 178/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2292 - accuracy: 0.2959 - val_loss: 3.1137 - val_accuracy: 0.3517
Epoch 179/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1960 - accuracy: 0.3181 - val_loss: 3.1143 - val_accuracy: 0.3457
Epoch 180/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1933 - accuracy: 0.3025 - val_loss: 3.1069 - val_accuracy: 0.3531
Epoch 181/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2070 - accuracy: 0.3113 - val_loss: 3.1079 - val_accuracy: 0.3471
Epoch 182/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2095 - accuracy: 0.3021 - val_loss: 3.0976 - val_accuracy: 0.3477
Epoch 183/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1636 - accuracy: 0.3162 - val_loss: 3.1035 - val_accuracy: 0.3484
Epoch 184/200
266/266 [==============================] - 1s 4ms/step - loss: 3.2136 - accuracy: 0.3025 - val_loss: 3.0872 - val_accuracy: 0.3551
Epoch 185/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1900 - accuracy: 0.3133 - val_loss: 3.0921 - val_accuracy: 0.3511
Epoch 186/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1734 - accuracy: 0.3134 - val_loss: 3.0912 - val_accuracy: 0.3517
Epoch 187/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1746 - accuracy: 0.3057 - val_loss: 3.0777 - val_accuracy: 0.3504
Epoch 188/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1800 - accuracy: 0.3146 - val_loss: 3.0822 - val_accuracy: 0.3477
Epoch 189/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1780 - accuracy: 0.3151 - val_loss: 3.0747 - val_accuracy: 0.3557
Epoch 190/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1691 - accuracy: 0.3115 - val_loss: 3.0737 - val_accuracy: 0.3524
Epoch 191/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1650 - accuracy: 0.3127 - val_loss: 3.0761 - val_accuracy: 0.3491
Epoch 192/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1862 - accuracy: 0.3103 - val_loss: 3.0585 - val_accuracy: 0.3551
Epoch 193/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1540 - accuracy: 0.3159 - val_loss: 3.0597 - val_accuracy: 0.3544
Epoch 194/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1601 - accuracy: 0.3171 - val_loss: 3.0573 - val_accuracy: 0.3537
Epoch 195/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1581 - accuracy: 0.3140 - val_loss: 3.0499 - val_accuracy: 0.3577
Epoch 196/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1355 - accuracy: 0.3294 - val_loss: 3.0520 - val_accuracy: 0.3557
Epoch 197/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1289 - accuracy: 0.3266 - val_loss: 3.0550 - val_accuracy: 0.3610
Epoch 198/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1422 - accuracy: 0.3068 - val_loss: 3.0554 - val_accuracy: 0.3570
Epoch 199/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1460 - accuracy: 0.3132 - val_loss: 3.0406 - val_accuracy: 0.3584
Epoch 200/200
266/266 [==============================] - 1s 4ms/step - loss: 3.1432 - accuracy: 0.3160 - val_loss: 3.0381 - val_accuracy: 0.3577
In [ ]:
_, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
accuracies_opt_SGD["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     3.065
Accuracy: 34.524%
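The training runs above pass a `callback` into `train_model(..., callbacks=[callback])`. Assuming this is a patience-based early-stopping mechanism in the style of `tf.keras.callbacks.EarlyStopping` (an assumption — the callback's definition is outside this excerpt), its core logic can be sketched in plain Python; the function name and signature below are illustrative only:

```python
# Minimal sketch of patience-based early stopping on val_loss,
# mirroring the behaviour of tf.keras.callbacks.EarlyStopping.
# `early_stopping_epoch` is a hypothetical helper, not part of the notebook.

def early_stopping_epoch(val_losses, patience=10):
    """Return the 1-based epoch at which training would stop,
    or None if all epochs complete without triggering the stop."""
    best = float("inf")
    wait = 0  # epochs since the last improvement
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None

# val_loss keeps improving -> training runs to completion
print(early_stopping_epoch([3.5, 3.4, 3.3, 3.2], patience=2))  # None
# val_loss plateaus -> stop after `patience` epochs without improvement
print(early_stopping_epoch([3.5, 3.4, 3.45, 3.44, 3.43], patience=2))  # 4
```

This explains why the runs above often consume all 200 epochs: the validation loss is still decreasing slowly at the end, so the patience counter keeps resetting.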
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True, optimizer = tf.optimizers.SGD)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_21"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_23 (Conv2D)           (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_13 (Batc (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_13 (ReLU)              (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_16 (MaxPooling (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_29 (Dropout)         (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_24 (Conv2D)           (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_14 (Batc (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_14 (ReLU)              (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_17 (MaxPooling (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_30 (Dropout)         (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_25 (Conv2D)           (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_15 (Batc (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_15 (ReLU)              (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d_2 (Average (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_31 (Dropout)         (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_7 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_29 (Dense)             (None, 1024)              525312    
_________________________________________________________________
dropout_32 (Dropout)         (None, 1024)              0         
_________________________________________________________________
dense_30 (Dense)             (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
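The parameter counts in the summary above follow directly from the standard formulas for Conv2D (`k·k·c_in·c_out + c_out`), BatchNormalization (`4·c`: gamma/beta trainable, moving mean/variance non-trainable) and Dense (`n_in·n_out + n_out`) layers. A quick sanity check reproducing the reported totals:

```python
# Verify the CNN1 parameter counts reported by model.summary().

def conv2d_params(k, c_in, c_out):
    # k*k kernel over c_in channels per filter, plus one bias per filter
    return k * k * c_in * c_out + c_out

def batchnorm_params(c):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * c

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

layer_params = [
    conv2d_params(3, 3, 32),          # conv2d_23  -> 896
    batchnorm_params(32),             # batch_normalization_13 -> 128
    conv2d_params(3, 32, 64),         # conv2d_24  -> 18496
    batchnorm_params(64),             # batch_normalization_14 -> 256
    conv2d_params(3, 64, 128),        # conv2d_25  -> 73856
    batchnorm_params(128),            # batch_normalization_15 -> 512
    dense_params(2 * 2 * 128, 1024),  # dense_29 (after flatten) -> 525312
    dense_params(1024, 20),           # dense_30 -> 20500
]

total = sum(layer_params)
# half of each BatchNorm layer's parameters are non-trainable
non_trainable = (batchnorm_params(32) + batchnorm_params(64)
                 + batchnorm_params(128)) // 2
print(total, total - non_trainable, non_trainable)  # 639956 639508 448
```

The three printed values match the summary's total, trainable, and non-trainable parameter counts.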
Epoch 1/200
266/266 [==============================] - 2s 5ms/step - loss: 4.5147 - accuracy: 0.0425 - val_loss: 4.3733 - val_accuracy: 0.0519
Epoch 2/200
266/266 [==============================] - 1s 4ms/step - loss: 4.4941 - accuracy: 0.0514 - val_loss: 4.3556 - val_accuracy: 0.0559
Epoch 3/200
266/266 [==============================] - 1s 4ms/step - loss: 4.4586 - accuracy: 0.0561 - val_loss: 4.3282 - val_accuracy: 0.0758
Epoch 4/200
266/266 [==============================] - 1s 4ms/step - loss: 4.4272 - accuracy: 0.0563 - val_loss: 4.3075 - val_accuracy: 0.0924
Epoch 5/200
266/266 [==============================] - 1s 4ms/step - loss: 4.4038 - accuracy: 0.0579 - val_loss: 4.2892 - val_accuracy: 0.1024
Epoch 6/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3907 - accuracy: 0.0692 - val_loss: 4.2722 - val_accuracy: 0.1124
Epoch 7/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3586 - accuracy: 0.0640 - val_loss: 4.2555 - val_accuracy: 0.1243
Epoch 8/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3623 - accuracy: 0.0706 - val_loss: 4.2399 - val_accuracy: 0.1297
Epoch 9/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3328 - accuracy: 0.0766 - val_loss: 4.2227 - val_accuracy: 0.1343
Epoch 10/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3257 - accuracy: 0.0805 - val_loss: 4.2082 - val_accuracy: 0.1430
Epoch 11/200
266/266 [==============================] - 1s 4ms/step - loss: 4.3042 - accuracy: 0.0879 - val_loss: 4.1939 - val_accuracy: 0.1509
Epoch 12/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2750 - accuracy: 0.0930 - val_loss: 4.1762 - val_accuracy: 0.1543
Epoch 13/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2660 - accuracy: 0.1050 - val_loss: 4.1641 - val_accuracy: 0.1536
Epoch 14/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2510 - accuracy: 0.1057 - val_loss: 4.1503 - val_accuracy: 0.1622
Epoch 15/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2315 - accuracy: 0.1082 - val_loss: 4.1358 - val_accuracy: 0.1682
Epoch 16/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2243 - accuracy: 0.1104 - val_loss: 4.1236 - val_accuracy: 0.1749
Epoch 17/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2147 - accuracy: 0.1164 - val_loss: 4.1105 - val_accuracy: 0.1762
Epoch 18/200
266/266 [==============================] - 1s 4ms/step - loss: 4.2026 - accuracy: 0.1210 - val_loss: 4.0998 - val_accuracy: 0.1795
Epoch 19/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1847 - accuracy: 0.1263 - val_loss: 4.0854 - val_accuracy: 0.1795
Epoch 20/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1733 - accuracy: 0.1241 - val_loss: 4.0762 - val_accuracy: 0.1815
Epoch 21/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1698 - accuracy: 0.1261 - val_loss: 4.0646 - val_accuracy: 0.1875
Epoch 22/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1449 - accuracy: 0.1362 - val_loss: 4.0537 - val_accuracy: 0.1882
Epoch 23/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1481 - accuracy: 0.1329 - val_loss: 4.0459 - val_accuracy: 0.1875
Epoch 24/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1321 - accuracy: 0.1415 - val_loss: 4.0379 - val_accuracy: 0.1868
Epoch 25/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1302 - accuracy: 0.1377 - val_loss: 4.0261 - val_accuracy: 0.1908
Epoch 26/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1052 - accuracy: 0.1462 - val_loss: 4.0173 - val_accuracy: 0.1895
Epoch 27/200
266/266 [==============================] - 1s 4ms/step - loss: 4.1143 - accuracy: 0.1429 - val_loss: 4.0097 - val_accuracy: 0.1915
Epoch 28/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0927 - accuracy: 0.1515 - val_loss: 4.0009 - val_accuracy: 0.1888
Epoch 29/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0955 - accuracy: 0.1525 - val_loss: 3.9942 - val_accuracy: 0.1915
Epoch 30/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0883 - accuracy: 0.1527 - val_loss: 3.9855 - val_accuracy: 0.1948
Epoch 31/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0804 - accuracy: 0.1587 - val_loss: 3.9773 - val_accuracy: 0.1935
Epoch 32/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0675 - accuracy: 0.1551 - val_loss: 3.9710 - val_accuracy: 0.1961
Epoch 33/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0546 - accuracy: 0.1522 - val_loss: 3.9659 - val_accuracy: 0.1968
Epoch 34/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0543 - accuracy: 0.1560 - val_loss: 3.9615 - val_accuracy: 0.1961
Epoch 35/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0478 - accuracy: 0.1619 - val_loss: 3.9535 - val_accuracy: 0.1975
Epoch 36/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0581 - accuracy: 0.1516 - val_loss: 3.9430 - val_accuracy: 0.2015
Epoch 37/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0186 - accuracy: 0.1730 - val_loss: 3.9367 - val_accuracy: 0.2035
Epoch 38/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0262 - accuracy: 0.1663 - val_loss: 3.9311 - val_accuracy: 0.2048
Epoch 39/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9987 - accuracy: 0.1775 - val_loss: 3.9276 - val_accuracy: 0.2021
Epoch 40/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0000 - accuracy: 0.1768 - val_loss: 3.9199 - val_accuracy: 0.2055
Epoch 41/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9975 - accuracy: 0.1702 - val_loss: 3.9116 - val_accuracy: 0.2088
Epoch 42/200
266/266 [==============================] - 1s 4ms/step - loss: 4.0087 - accuracy: 0.1651 - val_loss: 3.9078 - val_accuracy: 0.2068
Epoch 43/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9908 - accuracy: 0.1736 - val_loss: 3.9027 - val_accuracy: 0.2114
Epoch 44/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9954 - accuracy: 0.1731 - val_loss: 3.8950 - val_accuracy: 0.2134
Epoch 45/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9777 - accuracy: 0.1684 - val_loss: 3.8895 - val_accuracy: 0.2134
Epoch 46/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9595 - accuracy: 0.1840 - val_loss: 3.8850 - val_accuracy: 0.2154
Epoch 47/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9576 - accuracy: 0.1758 - val_loss: 3.8806 - val_accuracy: 0.2154
Epoch 48/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9672 - accuracy: 0.1814 - val_loss: 3.8770 - val_accuracy: 0.2148
Epoch 49/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9604 - accuracy: 0.1887 - val_loss: 3.8709 - val_accuracy: 0.2161
Epoch 50/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9504 - accuracy: 0.1876 - val_loss: 3.8624 - val_accuracy: 0.2188
Epoch 51/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9397 - accuracy: 0.1864 - val_loss: 3.8619 - val_accuracy: 0.2161
Epoch 52/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9367 - accuracy: 0.1881 - val_loss: 3.8545 - val_accuracy: 0.2214
Epoch 53/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9500 - accuracy: 0.1795 - val_loss: 3.8480 - val_accuracy: 0.2227
Epoch 54/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9297 - accuracy: 0.1849 - val_loss: 3.8462 - val_accuracy: 0.2207
Epoch 55/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9196 - accuracy: 0.1982 - val_loss: 3.8387 - val_accuracy: 0.2261
Epoch 56/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9068 - accuracy: 0.1923 - val_loss: 3.8357 - val_accuracy: 0.2241
Epoch 57/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8941 - accuracy: 0.2052 - val_loss: 3.8263 - val_accuracy: 0.2314
Epoch 58/200
266/266 [==============================] - 1s 4ms/step - loss: 3.9024 - accuracy: 0.1919 - val_loss: 3.8243 - val_accuracy: 0.2287
Epoch 59/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8925 - accuracy: 0.1876 - val_loss: 3.8200 - val_accuracy: 0.2267
Epoch 60/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8740 - accuracy: 0.2022 - val_loss: 3.8174 - val_accuracy: 0.2267
Epoch 61/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8737 - accuracy: 0.2072 - val_loss: 3.8116 - val_accuracy: 0.2274
Epoch 62/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8907 - accuracy: 0.1981 - val_loss: 3.8064 - val_accuracy: 0.2294
Epoch 63/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8805 - accuracy: 0.2015 - val_loss: 3.8003 - val_accuracy: 0.2314
Epoch 64/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8786 - accuracy: 0.1957 - val_loss: 3.8001 - val_accuracy: 0.2294
Epoch 65/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8465 - accuracy: 0.2067 - val_loss: 3.7912 - val_accuracy: 0.2307
Epoch 66/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8903 - accuracy: 0.1939 - val_loss: 3.7909 - val_accuracy: 0.2301
Epoch 67/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8736 - accuracy: 0.1920 - val_loss: 3.7809 - val_accuracy: 0.2327
Epoch 68/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8580 - accuracy: 0.1969 - val_loss: 3.7824 - val_accuracy: 0.2340
Epoch 69/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8594 - accuracy: 0.2035 - val_loss: 3.7719 - val_accuracy: 0.2367
Epoch 70/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8455 - accuracy: 0.2137 - val_loss: 3.7687 - val_accuracy: 0.2387
Epoch 71/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8309 - accuracy: 0.2192 - val_loss: 3.7681 - val_accuracy: 0.2400
Epoch 72/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8365 - accuracy: 0.2125 - val_loss: 3.7672 - val_accuracy: 0.2380
Epoch 73/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8463 - accuracy: 0.2091 - val_loss: 3.7588 - val_accuracy: 0.2407
Epoch 74/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8322 - accuracy: 0.2127 - val_loss: 3.7515 - val_accuracy: 0.2427
Epoch 75/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8221 - accuracy: 0.2205 - val_loss: 3.7496 - val_accuracy: 0.2434
Epoch 76/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8327 - accuracy: 0.2104 - val_loss: 3.7466 - val_accuracy: 0.2440
Epoch 77/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8091 - accuracy: 0.2137 - val_loss: 3.7384 - val_accuracy: 0.2447
Epoch 78/200
266/266 [==============================] - 1s 4ms/step - loss: 3.8180 - accuracy: 0.2133 - val_loss: 3.7390 - val_accuracy: 0.2447
Epoch 79/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7968 - accuracy: 0.2160 - val_loss: 3.7293 - val_accuracy: 0.2467
Epoch 80/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7897 - accuracy: 0.2262 - val_loss: 3.7277 - val_accuracy: 0.2467
Epoch 81/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7996 - accuracy: 0.2169 - val_loss: 3.7252 - val_accuracy: 0.2487
Epoch 82/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7977 - accuracy: 0.2128 - val_loss: 3.7179 - val_accuracy: 0.2487
Epoch 83/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7827 - accuracy: 0.2229 - val_loss: 3.7139 - val_accuracy: 0.2480
Epoch 84/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7645 - accuracy: 0.2296 - val_loss: 3.7133 - val_accuracy: 0.2473
Epoch 85/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7890 - accuracy: 0.2225 - val_loss: 3.7113 - val_accuracy: 0.2493
Epoch 86/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7876 - accuracy: 0.2127 - val_loss: 3.7052 - val_accuracy: 0.2513
Epoch 87/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7655 - accuracy: 0.2264 - val_loss: 3.6963 - val_accuracy: 0.2540
Epoch 88/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7527 - accuracy: 0.2284 - val_loss: 3.6968 - val_accuracy: 0.2566
Epoch 89/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7620 - accuracy: 0.2391 - val_loss: 3.6937 - val_accuracy: 0.2533
Epoch 90/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7569 - accuracy: 0.2360 - val_loss: 3.6895 - val_accuracy: 0.2540
Epoch 91/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7540 - accuracy: 0.2307 - val_loss: 3.6877 - val_accuracy: 0.2540
Epoch 92/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7377 - accuracy: 0.2359 - val_loss: 3.6792 - val_accuracy: 0.2586
Epoch 93/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7450 - accuracy: 0.2257 - val_loss: 3.6739 - val_accuracy: 0.2600
Epoch 94/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7451 - accuracy: 0.2300 - val_loss: 3.6667 - val_accuracy: 0.2640
Epoch 95/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7468 - accuracy: 0.2332 - val_loss: 3.6677 - val_accuracy: 0.2620
Epoch 96/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7165 - accuracy: 0.2440 - val_loss: 3.6651 - val_accuracy: 0.2580
Epoch 97/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7382 - accuracy: 0.2324 - val_loss: 3.6598 - val_accuracy: 0.2660
Epoch 98/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7179 - accuracy: 0.2434 - val_loss: 3.6571 - val_accuracy: 0.2640
Epoch 99/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7169 - accuracy: 0.2387 - val_loss: 3.6487 - val_accuracy: 0.2680
Epoch 100/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7098 - accuracy: 0.2392 - val_loss: 3.6442 - val_accuracy: 0.2706
Epoch 101/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6952 - accuracy: 0.2409 - val_loss: 3.6454 - val_accuracy: 0.2693
Epoch 102/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7037 - accuracy: 0.2431 - val_loss: 3.6344 - val_accuracy: 0.2719
Epoch 103/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6937 - accuracy: 0.2465 - val_loss: 3.6368 - val_accuracy: 0.2713
Epoch 104/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6840 - accuracy: 0.2517 - val_loss: 3.6268 - val_accuracy: 0.2739
Epoch 105/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6865 - accuracy: 0.2487 - val_loss: 3.6250 - val_accuracy: 0.2746
Epoch 106/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6978 - accuracy: 0.2395 - val_loss: 3.6187 - val_accuracy: 0.2753
Epoch 107/200
266/266 [==============================] - 1s 4ms/step - loss: 3.7073 - accuracy: 0.2446 - val_loss: 3.6164 - val_accuracy: 0.2779
Epoch 108/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6886 - accuracy: 0.2362 - val_loss: 3.6131 - val_accuracy: 0.2779
Epoch 109/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6580 - accuracy: 0.2563 - val_loss: 3.6070 - val_accuracy: 0.2766
Epoch 110/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6710 - accuracy: 0.2480 - val_loss: 3.6047 - val_accuracy: 0.2779
Epoch 111/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6638 - accuracy: 0.2475 - val_loss: 3.5926 - val_accuracy: 0.2839
Epoch 112/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6532 - accuracy: 0.2432 - val_loss: 3.5947 - val_accuracy: 0.2793
Epoch 113/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6622 - accuracy: 0.2525 - val_loss: 3.5960 - val_accuracy: 0.2799
Epoch 114/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6498 - accuracy: 0.2484 - val_loss: 3.5826 - val_accuracy: 0.2846
Epoch 115/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6612 - accuracy: 0.2438 - val_loss: 3.5867 - val_accuracy: 0.2839
Epoch 116/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6572 - accuracy: 0.2398 - val_loss: 3.5832 - val_accuracy: 0.2832
Epoch 117/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6447 - accuracy: 0.2622 - val_loss: 3.5752 - val_accuracy: 0.2866
Epoch 118/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6517 - accuracy: 0.2380 - val_loss: 3.5730 - val_accuracy: 0.2892
Epoch 119/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6491 - accuracy: 0.2520 - val_loss: 3.5632 - val_accuracy: 0.2906
Epoch 120/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6492 - accuracy: 0.2476 - val_loss: 3.5662 - val_accuracy: 0.2892
Epoch 121/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6344 - accuracy: 0.2556 - val_loss: 3.5600 - val_accuracy: 0.2899
Epoch 122/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6439 - accuracy: 0.2435 - val_loss: 3.5571 - val_accuracy: 0.2919
Epoch 123/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6234 - accuracy: 0.2560 - val_loss: 3.5486 - val_accuracy: 0.2932
Epoch 124/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6306 - accuracy: 0.2553 - val_loss: 3.5481 - val_accuracy: 0.2926
Epoch 125/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5984 - accuracy: 0.2631 - val_loss: 3.5418 - val_accuracy: 0.2945
Epoch 126/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6166 - accuracy: 0.2537 - val_loss: 3.5385 - val_accuracy: 0.2945
Epoch 127/200
266/266 [==============================] - 1s 4ms/step - loss: 3.6252 - accuracy: 0.2544 - val_loss: 3.5357 - val_accuracy: 0.2959
Epoch 128/200
266/266 [==============================] - 1s 5ms/step - loss: 3.6052 - accuracy: 0.2565 - val_loss: 3.5249 - val_accuracy: 0.2965
Epoch 129/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5952 - accuracy: 0.2589 - val_loss: 3.5265 - val_accuracy: 0.2999
Epoch 130/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5967 - accuracy: 0.2632 - val_loss: 3.5229 - val_accuracy: 0.2972
Epoch 131/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5817 - accuracy: 0.2734 - val_loss: 3.5177 - val_accuracy: 0.2985
Epoch 132/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5713 - accuracy: 0.2693 - val_loss: 3.5153 - val_accuracy: 0.2985
Epoch 133/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5925 - accuracy: 0.2637 - val_loss: 3.5115 - val_accuracy: 0.3012
Epoch 134/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5771 - accuracy: 0.2678 - val_loss: 3.5122 - val_accuracy: 0.3019
Epoch 135/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5891 - accuracy: 0.2614 - val_loss: 3.4963 - val_accuracy: 0.3072
Epoch 136/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5813 - accuracy: 0.2573 - val_loss: 3.5029 - val_accuracy: 0.3019
Epoch 137/200
266/266 [==============================] - 1s 5ms/step - loss: 3.5778 - accuracy: 0.2598 - val_loss: 3.4969 - val_accuracy: 0.3078
Epoch 138/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5580 - accuracy: 0.2686 - val_loss: 3.4906 - val_accuracy: 0.3085
Epoch 139/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5706 - accuracy: 0.2702 - val_loss: 3.4883 - val_accuracy: 0.3098
Epoch 140/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5501 - accuracy: 0.2745 - val_loss: 3.4866 - val_accuracy: 0.3112
Epoch 141/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5667 - accuracy: 0.2648 - val_loss: 3.4773 - val_accuracy: 0.3125
Epoch 142/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5623 - accuracy: 0.2696 - val_loss: 3.4787 - val_accuracy: 0.3118
Epoch 143/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5541 - accuracy: 0.2711 - val_loss: 3.4695 - val_accuracy: 0.3138
Epoch 144/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5523 - accuracy: 0.2820 - val_loss: 3.4654 - val_accuracy: 0.3145
Epoch 145/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5430 - accuracy: 0.2759 - val_loss: 3.4638 - val_accuracy: 0.3145
Epoch 146/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5213 - accuracy: 0.2839 - val_loss: 3.4552 - val_accuracy: 0.3165
Epoch 147/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5226 - accuracy: 0.2815 - val_loss: 3.4567 - val_accuracy: 0.3172
Epoch 148/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5171 - accuracy: 0.2781 - val_loss: 3.4573 - val_accuracy: 0.3138
Epoch 149/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5299 - accuracy: 0.2694 - val_loss: 3.4552 - val_accuracy: 0.3211
Epoch 150/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5008 - accuracy: 0.2823 - val_loss: 3.4474 - val_accuracy: 0.3165
Epoch 151/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5231 - accuracy: 0.2811 - val_loss: 3.4366 - val_accuracy: 0.3231
Epoch 152/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5218 - accuracy: 0.2776 - val_loss: 3.4388 - val_accuracy: 0.3205
Epoch 153/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5007 - accuracy: 0.2869 - val_loss: 3.4303 - val_accuracy: 0.3225
Epoch 154/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5148 - accuracy: 0.2775 - val_loss: 3.4286 - val_accuracy: 0.3225
Epoch 155/200
266/266 [==============================] - 1s 4ms/step - loss: 3.5175 - accuracy: 0.2723 - val_loss: 3.4232 - val_accuracy: 0.3271
Epoch 156/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4977 - accuracy: 0.2883 - val_loss: 3.4204 - val_accuracy: 0.3238
Epoch 157/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4919 - accuracy: 0.2852 - val_loss: 3.4155 - val_accuracy: 0.3265
Epoch 158/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4945 - accuracy: 0.2883 - val_loss: 3.4036 - val_accuracy: 0.3278
Epoch 159/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4935 - accuracy: 0.2885 - val_loss: 3.4106 - val_accuracy: 0.3265
Epoch 160/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4815 - accuracy: 0.2876 - val_loss: 3.4009 - val_accuracy: 0.3258
Epoch 161/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4666 - accuracy: 0.2919 - val_loss: 3.3972 - val_accuracy: 0.3265
Epoch 162/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4868 - accuracy: 0.2839 - val_loss: 3.3921 - val_accuracy: 0.3311
Epoch 163/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4471 - accuracy: 0.3006 - val_loss: 3.3917 - val_accuracy: 0.3324
Epoch 164/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4830 - accuracy: 0.2843 - val_loss: 3.3869 - val_accuracy: 0.3338
Epoch 165/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4701 - accuracy: 0.2879 - val_loss: 3.3816 - val_accuracy: 0.3285
Epoch 166/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4538 - accuracy: 0.2899 - val_loss: 3.3800 - val_accuracy: 0.3344
Epoch 167/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4499 - accuracy: 0.2925 - val_loss: 3.3769 - val_accuracy: 0.3344
Epoch 168/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4626 - accuracy: 0.2851 - val_loss: 3.3749 - val_accuracy: 0.3371
Epoch 169/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4439 - accuracy: 0.3025 - val_loss: 3.3627 - val_accuracy: 0.3358
Epoch 170/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4530 - accuracy: 0.2941 - val_loss: 3.3685 - val_accuracy: 0.3384
Epoch 171/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4316 - accuracy: 0.2948 - val_loss: 3.3672 - val_accuracy: 0.3378
Epoch 172/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4201 - accuracy: 0.2945 - val_loss: 3.3589 - val_accuracy: 0.3378
Epoch 173/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4362 - accuracy: 0.2915 - val_loss: 3.3553 - val_accuracy: 0.3398
Epoch 174/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4194 - accuracy: 0.3001 - val_loss: 3.3486 - val_accuracy: 0.3391
Epoch 175/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4176 - accuracy: 0.3070 - val_loss: 3.3450 - val_accuracy: 0.3404
Epoch 176/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4221 - accuracy: 0.2985 - val_loss: 3.3426 - val_accuracy: 0.3438
Epoch 177/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4059 - accuracy: 0.3026 - val_loss: 3.3426 - val_accuracy: 0.3438
Epoch 178/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4118 - accuracy: 0.3032 - val_loss: 3.3332 - val_accuracy: 0.3424
Epoch 179/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4108 - accuracy: 0.3006 - val_loss: 3.3337 - val_accuracy: 0.3418
Epoch 180/200
266/266 [==============================] - 1s 4ms/step - loss: 3.4118 - accuracy: 0.2995 - val_loss: 3.3257 - val_accuracy: 0.3471
Epoch 181/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3986 - accuracy: 0.3048 - val_loss: 3.3222 - val_accuracy: 0.3464
Epoch 182/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3962 - accuracy: 0.3089 - val_loss: 3.3190 - val_accuracy: 0.3471
Epoch 183/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3919 - accuracy: 0.3136 - val_loss: 3.3146 - val_accuracy: 0.3497
Epoch 184/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3941 - accuracy: 0.3098 - val_loss: 3.3120 - val_accuracy: 0.3451
Epoch 185/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3809 - accuracy: 0.3114 - val_loss: 3.3133 - val_accuracy: 0.3491
Epoch 186/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3804 - accuracy: 0.3012 - val_loss: 3.3064 - val_accuracy: 0.3457
Epoch 187/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3684 - accuracy: 0.3104 - val_loss: 3.3012 - val_accuracy: 0.3537
Epoch 188/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3801 - accuracy: 0.3040 - val_loss: 3.2995 - val_accuracy: 0.3517
Epoch 189/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3899 - accuracy: 0.3057 - val_loss: 3.2902 - val_accuracy: 0.3551
Epoch 190/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3770 - accuracy: 0.3142 - val_loss: 3.2938 - val_accuracy: 0.3504
Epoch 191/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3748 - accuracy: 0.3172 - val_loss: 3.2887 - val_accuracy: 0.3524
Epoch 192/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3593 - accuracy: 0.3159 - val_loss: 3.2792 - val_accuracy: 0.3531
Epoch 193/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3569 - accuracy: 0.3077 - val_loss: 3.2781 - val_accuracy: 0.3537
Epoch 194/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3531 - accuracy: 0.3116 - val_loss: 3.2797 - val_accuracy: 0.3531
Epoch 195/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3415 - accuracy: 0.3153 - val_loss: 3.2671 - val_accuracy: 0.3597
Epoch 196/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3642 - accuracy: 0.3107 - val_loss: 3.2757 - val_accuracy: 0.3531
Epoch 197/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3472 - accuracy: 0.3196 - val_loss: 3.2627 - val_accuracy: 0.3564
Epoch 198/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3348 - accuracy: 0.3252 - val_loss: 3.2604 - val_accuracy: 0.3551
Epoch 199/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3374 - accuracy: 0.3162 - val_loss: 3.2580 - val_accuracy: 0.3590
Epoch 200/200
266/266 [==============================] - 1s 4ms/step - loss: 3.3412 - accuracy: 0.3169 - val_loss: 3.2487 - val_accuracy: 0.3590
In [ ]:
# report test-set metrics for the SGD-optimized CNN1 and store its accuracy
_, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
accuracies_opt_SGD["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     3.270
Accuracy: 35.069%
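The accuracy reported above is plain top-1 accuracy: the fraction of test samples whose highest-probability class matches the ground truth. A minimal sketch of that computation (the probabilities and labels below are made up for illustration, not taken from the model):

```python
# Hypothetical softmax outputs for four test samples over three classes
probs = [
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.4, 0.3],
    [0.6, 0.3, 0.1],
]
true_classes = [0, 1, 0, 1]  # ground-truth class indices

# top-1 prediction: index of the largest probability per sample
pred_classes = [max(range(len(p)), key=p.__getitem__) for p in probs]
accuracy = sum(p == t for p, t in zip(pred_classes, true_classes)) / len(true_classes)
print(f"Accuracy: {accuracy:.3%}")  # 2 of 4 correct -> 50.000%
```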
CNN2
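The `train_model` calls in this notebook pass `callbacks=[callback]`. Assuming `callback` is an EarlyStopping-style monitor on `val_loss` (its exact configuration is defined earlier in the notebook), the stopping rule it applies can be sketched in plain Python:

```python
# Sketch of early-stopping logic: stop once val_loss has not improved
# for `patience` consecutive epochs. `should_stop` is a hypothetical
# helper, not part of the notebook or of Keras itself.
def should_stop(val_losses, patience=10):
    if len(val_losses) <= patience:
        return False
    best_so_far = min(val_losses[:-patience])
    # no value in the last `patience` epochs beat the earlier best
    return min(val_losses[-patience:]) >= best_so_far

# A steadily improving loss never triggers a stop...
assert not should_stop([3.0, 2.9, 2.8, 2.7], patience=2)
# ...while a plateau does
assert should_stop([3.0, 2.9, 2.9, 2.9, 2.9], patience=2)
```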
In [ ]:
# build the optimized CNN2 with plain SGD and train it for up to 200 epochs
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary=True, optimizer=tf.optimizers.SGD)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs=200, callbacks=[callback])
Model: "sequential_22"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_26 (Conv2D)           (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_16 (Batc (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_16 (ReLU)              (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_18 (MaxPooling (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_33 (Dropout)         (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_27 (Conv2D)           (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_17 (Batc (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_17 (ReLU)              (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_19 (MaxPooling (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_34 (Dropout)         (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_28 (Conv2D)           (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_18 (Batc (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_18 (ReLU)              (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_20 (MaxPooling (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_35 (Dropout)         (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_29 (Conv2D)           (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_19 (Batc (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_19 (ReLU)              (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_36 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_8 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_31 (Dense)             (None, 512)               2097664   
_________________________________________________________________
dropout_37 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_32 (Dense)             (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
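The Param # column of the summary above follows directly from the layer shapes. As a sanity check, the Conv2D and Dense counts can be reproduced by hand (each BatchNormalization layer contributes 4 parameters per channel, half of them non-trainable, which accounts for the 960 non-trainable total):

```python
# Parameter count for a Conv2D layer: k*k*c_in weights per filter, plus one bias
def conv_params(k, c_in, c_out):
    return (k * k * c_in + 1) * c_out

# ...and for a Dense layer: one weight per input plus a bias, per unit
def dense_params(n_in, n_out):
    return (n_in + 1) * n_out

print(conv_params(3, 3, 32))     # conv2d_26: 896
print(conv_params(3, 32, 64))    # conv2d_27: 18496
print(conv_params(3, 64, 128))   # conv2d_28: 73856
print(conv_params(3, 128, 256))  # conv2d_29: 295168
print(dense_params(4096, 512))   # dense_31: 2097664
print(dense_params(512, 20))     # dense_32: 10260
```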
Epoch 1/200
266/266 [==============================] - 2s 6ms/step - loss: 6.5689 - accuracy: 0.0533 - val_loss: 6.0183 - val_accuracy: 0.0572
Epoch 2/200
266/266 [==============================] - 1s 5ms/step - loss: 6.3806 - accuracy: 0.0518 - val_loss: 5.9579 - val_accuracy: 0.0572
Epoch 3/200
266/266 [==============================] - 1s 5ms/step - loss: 6.2628 - accuracy: 0.0687 - val_loss: 5.8836 - val_accuracy: 0.0924
Epoch 4/200
266/266 [==============================] - 1s 5ms/step - loss: 6.2068 - accuracy: 0.0681 - val_loss: 5.8462 - val_accuracy: 0.1130
Epoch 5/200
266/266 [==============================] - 1s 5ms/step - loss: 6.1522 - accuracy: 0.0816 - val_loss: 5.8245 - val_accuracy: 0.1263
Epoch 6/200
266/266 [==============================] - 1s 5ms/step - loss: 6.0841 - accuracy: 0.0905 - val_loss: 5.8028 - val_accuracy: 0.1390
Epoch 7/200
266/266 [==============================] - 1s 5ms/step - loss: 6.0302 - accuracy: 0.1007 - val_loss: 5.7829 - val_accuracy: 0.1503
Epoch 8/200
266/266 [==============================] - 1s 5ms/step - loss: 5.9934 - accuracy: 0.0986 - val_loss: 5.7652 - val_accuracy: 0.1549
Epoch 9/200
266/266 [==============================] - 1s 5ms/step - loss: 5.9720 - accuracy: 0.1056 - val_loss: 5.7528 - val_accuracy: 0.1549
Epoch 10/200
266/266 [==============================] - 1s 5ms/step - loss: 5.9272 - accuracy: 0.1169 - val_loss: 5.7409 - val_accuracy: 0.1589
Epoch 11/200
266/266 [==============================] - 1s 5ms/step - loss: 5.9097 - accuracy: 0.1111 - val_loss: 5.7247 - val_accuracy: 0.1669
Epoch 12/200
266/266 [==============================] - 1s 5ms/step - loss: 5.8823 - accuracy: 0.1226 - val_loss: 5.7135 - val_accuracy: 0.1682
Epoch 13/200
266/266 [==============================] - 1s 5ms/step - loss: 5.8497 - accuracy: 0.1242 - val_loss: 5.7018 - val_accuracy: 0.1689
Epoch 14/200
266/266 [==============================] - 1s 5ms/step - loss: 5.8392 - accuracy: 0.1290 - val_loss: 5.6909 - val_accuracy: 0.1735
Epoch 15/200
266/266 [==============================] - 1s 5ms/step - loss: 5.7958 - accuracy: 0.1347 - val_loss: 5.6800 - val_accuracy: 0.1742
Epoch 16/200
266/266 [==============================] - 1s 5ms/step - loss: 5.7732 - accuracy: 0.1432 - val_loss: 5.6635 - val_accuracy: 0.1769
Epoch 17/200
266/266 [==============================] - 1s 5ms/step - loss: 5.7592 - accuracy: 0.1495 - val_loss: 5.6619 - val_accuracy: 0.1749
Epoch 18/200
266/266 [==============================] - 1s 5ms/step - loss: 5.7490 - accuracy: 0.1448 - val_loss: 5.6536 - val_accuracy: 0.1742
Epoch 19/200
266/266 [==============================] - 1s 5ms/step - loss: 5.7501 - accuracy: 0.1439 - val_loss: 5.6428 - val_accuracy: 0.1802
Epoch 20/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6963 - accuracy: 0.1563 - val_loss: 5.6356 - val_accuracy: 0.1782
Epoch 21/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6742 - accuracy: 0.1611 - val_loss: 5.6247 - val_accuracy: 0.1835
Epoch 22/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6840 - accuracy: 0.1647 - val_loss: 5.6232 - val_accuracy: 0.1809
Epoch 23/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6614 - accuracy: 0.1616 - val_loss: 5.6164 - val_accuracy: 0.1795
Epoch 24/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6557 - accuracy: 0.1585 - val_loss: 5.6122 - val_accuracy: 0.1789
Epoch 25/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6578 - accuracy: 0.1665 - val_loss: 5.5985 - val_accuracy: 0.1835
Epoch 26/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6229 - accuracy: 0.1788 - val_loss: 5.5934 - val_accuracy: 0.1835
Epoch 27/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6228 - accuracy: 0.1748 - val_loss: 5.5791 - val_accuracy: 0.1882
Epoch 28/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5963 - accuracy: 0.1783 - val_loss: 5.5810 - val_accuracy: 0.1868
Epoch 29/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5684 - accuracy: 0.1901 - val_loss: 5.5692 - val_accuracy: 0.1928
Epoch 30/200
266/266 [==============================] - 1s 5ms/step - loss: 5.6129 - accuracy: 0.1753 - val_loss: 5.5650 - val_accuracy: 0.1902
Epoch 31/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5906 - accuracy: 0.1715 - val_loss: 5.5592 - val_accuracy: 0.1928
Epoch 32/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5617 - accuracy: 0.1828 - val_loss: 5.5457 - val_accuracy: 0.1941
Epoch 33/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5537 - accuracy: 0.1874 - val_loss: 5.5458 - val_accuracy: 0.1948
Epoch 34/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5499 - accuracy: 0.1885 - val_loss: 5.5408 - val_accuracy: 0.1961
Epoch 35/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5542 - accuracy: 0.1916 - val_loss: 5.5405 - val_accuracy: 0.1955
Epoch 36/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5272 - accuracy: 0.1892 - val_loss: 5.5299 - val_accuracy: 0.1975
Epoch 37/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5091 - accuracy: 0.1905 - val_loss: 5.5180 - val_accuracy: 0.2015
Epoch 38/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5123 - accuracy: 0.1949 - val_loss: 5.5060 - val_accuracy: 0.2028
Epoch 39/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4951 - accuracy: 0.2000 - val_loss: 5.5058 - val_accuracy: 0.2028
Epoch 40/200
266/266 [==============================] - 1s 5ms/step - loss: 5.5098 - accuracy: 0.2030 - val_loss: 5.4961 - val_accuracy: 0.2028
Epoch 41/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4856 - accuracy: 0.1959 - val_loss: 5.4903 - val_accuracy: 0.2041
Epoch 42/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4523 - accuracy: 0.1981 - val_loss: 5.4862 - val_accuracy: 0.2068
Epoch 43/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4560 - accuracy: 0.1977 - val_loss: 5.4705 - val_accuracy: 0.2134
Epoch 44/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4527 - accuracy: 0.2022 - val_loss: 5.4683 - val_accuracy: 0.2134
Epoch 45/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4322 - accuracy: 0.2025 - val_loss: 5.4695 - val_accuracy: 0.2094
Epoch 46/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4503 - accuracy: 0.2101 - val_loss: 5.4621 - val_accuracy: 0.2148
Epoch 47/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4241 - accuracy: 0.2089 - val_loss: 5.4499 - val_accuracy: 0.2174
Epoch 48/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4313 - accuracy: 0.2076 - val_loss: 5.4405 - val_accuracy: 0.2181
Epoch 49/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3879 - accuracy: 0.2160 - val_loss: 5.4353 - val_accuracy: 0.2181
Epoch 50/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3915 - accuracy: 0.2191 - val_loss: 5.4299 - val_accuracy: 0.2207
Epoch 51/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3786 - accuracy: 0.2216 - val_loss: 5.4234 - val_accuracy: 0.2221
Epoch 52/200
266/266 [==============================] - 1s 5ms/step - loss: 5.4079 - accuracy: 0.2132 - val_loss: 5.4202 - val_accuracy: 0.2221
Epoch 53/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3949 - accuracy: 0.2114 - val_loss: 5.4128 - val_accuracy: 0.2267
Epoch 54/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3663 - accuracy: 0.2143 - val_loss: 5.4123 - val_accuracy: 0.2234
Epoch 55/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3649 - accuracy: 0.2234 - val_loss: 5.3928 - val_accuracy: 0.2320
Epoch 56/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3579 - accuracy: 0.2223 - val_loss: 5.3894 - val_accuracy: 0.2307
Epoch 57/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3518 - accuracy: 0.2264 - val_loss: 5.3710 - val_accuracy: 0.2340
Epoch 58/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3478 - accuracy: 0.2311 - val_loss: 5.3828 - val_accuracy: 0.2327
Epoch 59/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3340 - accuracy: 0.2222 - val_loss: 5.3710 - val_accuracy: 0.2347
Epoch 60/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3471 - accuracy: 0.2285 - val_loss: 5.3756 - val_accuracy: 0.2320
Epoch 61/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3272 - accuracy: 0.2241 - val_loss: 5.3567 - val_accuracy: 0.2380
Epoch 62/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3110 - accuracy: 0.2268 - val_loss: 5.3536 - val_accuracy: 0.2387
Epoch 63/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3074 - accuracy: 0.2373 - val_loss: 5.3427 - val_accuracy: 0.2420
Epoch 64/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3293 - accuracy: 0.2240 - val_loss: 5.3433 - val_accuracy: 0.2367
Epoch 65/200
266/266 [==============================] - 1s 5ms/step - loss: 5.3044 - accuracy: 0.2370 - val_loss: 5.3216 - val_accuracy: 0.2420
Epoch 66/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2723 - accuracy: 0.2353 - val_loss: 5.3195 - val_accuracy: 0.2440
Epoch 67/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2815 - accuracy: 0.2284 - val_loss: 5.3243 - val_accuracy: 0.2394
Epoch 68/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2674 - accuracy: 0.2341 - val_loss: 5.3130 - val_accuracy: 0.2414
Epoch 69/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2462 - accuracy: 0.2392 - val_loss: 5.3132 - val_accuracy: 0.2420
Epoch 70/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2582 - accuracy: 0.2423 - val_loss: 5.3064 - val_accuracy: 0.2460
Epoch 71/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2436 - accuracy: 0.2367 - val_loss: 5.2947 - val_accuracy: 0.2487
Epoch 72/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2328 - accuracy: 0.2475 - val_loss: 5.2905 - val_accuracy: 0.2487
Epoch 73/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2279 - accuracy: 0.2592 - val_loss: 5.2874 - val_accuracy: 0.2460
Epoch 74/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2377 - accuracy: 0.2445 - val_loss: 5.2713 - val_accuracy: 0.2540
Epoch 75/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2193 - accuracy: 0.2444 - val_loss: 5.2660 - val_accuracy: 0.2560
Epoch 76/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1991 - accuracy: 0.2593 - val_loss: 5.2679 - val_accuracy: 0.2527
Epoch 77/200
266/266 [==============================] - 1s 5ms/step - loss: 5.2162 - accuracy: 0.2515 - val_loss: 5.2566 - val_accuracy: 0.2580
Epoch 78/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1998 - accuracy: 0.2550 - val_loss: 5.2440 - val_accuracy: 0.2580
Epoch 79/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1798 - accuracy: 0.2494 - val_loss: 5.2459 - val_accuracy: 0.2566
Epoch 80/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1909 - accuracy: 0.2521 - val_loss: 5.2351 - val_accuracy: 0.2593
Epoch 81/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1773 - accuracy: 0.2536 - val_loss: 5.2291 - val_accuracy: 0.2613
Epoch 82/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1531 - accuracy: 0.2635 - val_loss: 5.2156 - val_accuracy: 0.2673
Epoch 83/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1783 - accuracy: 0.2561 - val_loss: 5.2146 - val_accuracy: 0.2666
Epoch 84/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1823 - accuracy: 0.2500 - val_loss: 5.2043 - val_accuracy: 0.2706
Epoch 85/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1602 - accuracy: 0.2580 - val_loss: 5.1999 - val_accuracy: 0.2699
Epoch 86/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1295 - accuracy: 0.2562 - val_loss: 5.2021 - val_accuracy: 0.2686
Epoch 87/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1355 - accuracy: 0.2618 - val_loss: 5.1898 - val_accuracy: 0.2719
Epoch 88/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1338 - accuracy: 0.2594 - val_loss: 5.1789 - val_accuracy: 0.2753
Epoch 89/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1343 - accuracy: 0.2612 - val_loss: 5.1790 - val_accuracy: 0.2759
Epoch 90/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1086 - accuracy: 0.2656 - val_loss: 5.1642 - val_accuracy: 0.2759
Epoch 91/200
266/266 [==============================] - 1s 5ms/step - loss: 5.1249 - accuracy: 0.2679 - val_loss: 5.1574 - val_accuracy: 0.2773
Epoch 92/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0868 - accuracy: 0.2749 - val_loss: 5.1570 - val_accuracy: 0.2779
Epoch 93/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0722 - accuracy: 0.2752 - val_loss: 5.1480 - val_accuracy: 0.2779
Epoch 94/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0876 - accuracy: 0.2739 - val_loss: 5.1372 - val_accuracy: 0.2779
Epoch 95/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0676 - accuracy: 0.2705 - val_loss: 5.1369 - val_accuracy: 0.2766
Epoch 96/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0961 - accuracy: 0.2712 - val_loss: 5.1242 - val_accuracy: 0.2766
Epoch 97/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0628 - accuracy: 0.2694 - val_loss: 5.1207 - val_accuracy: 0.2793
Epoch 98/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0830 - accuracy: 0.2665 - val_loss: 5.1226 - val_accuracy: 0.2779
Epoch 99/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0608 - accuracy: 0.2810 - val_loss: 5.1025 - val_accuracy: 0.2812
Epoch 100/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0398 - accuracy: 0.2746 - val_loss: 5.1113 - val_accuracy: 0.2773
Epoch 101/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0724 - accuracy: 0.2630 - val_loss: 5.0969 - val_accuracy: 0.2819
Epoch 102/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0462 - accuracy: 0.2836 - val_loss: 5.0950 - val_accuracy: 0.2812
Epoch 103/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0659 - accuracy: 0.2663 - val_loss: 5.0842 - val_accuracy: 0.2839
Epoch 104/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0027 - accuracy: 0.2957 - val_loss: 5.0775 - val_accuracy: 0.2886
Epoch 105/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0299 - accuracy: 0.2902 - val_loss: 5.0764 - val_accuracy: 0.2852
Epoch 106/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0282 - accuracy: 0.2807 - val_loss: 5.0746 - val_accuracy: 0.2872
Epoch 107/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9998 - accuracy: 0.2810 - val_loss: 5.0689 - val_accuracy: 0.2886
Epoch 108/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9956 - accuracy: 0.2857 - val_loss: 5.0604 - val_accuracy: 0.2879
Epoch 109/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9959 - accuracy: 0.2887 - val_loss: 5.0477 - val_accuracy: 0.2906
Epoch 110/200
266/266 [==============================] - 1s 5ms/step - loss: 5.0165 - accuracy: 0.2869 - val_loss: 5.0498 - val_accuracy: 0.2819
Epoch 111/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9855 - accuracy: 0.2852 - val_loss: 5.0289 - val_accuracy: 0.2919
Epoch 112/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9833 - accuracy: 0.2863 - val_loss: 5.0345 - val_accuracy: 0.2912
Epoch 113/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9761 - accuracy: 0.2874 - val_loss: 5.0331 - val_accuracy: 0.2939
Epoch 114/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9751 - accuracy: 0.2866 - val_loss: 5.0235 - val_accuracy: 0.2959
Epoch 115/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9516 - accuracy: 0.2953 - val_loss: 5.0076 - val_accuracy: 0.2992
Epoch 116/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9628 - accuracy: 0.2880 - val_loss: 5.0116 - val_accuracy: 0.2979
Epoch 117/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9612 - accuracy: 0.2917 - val_loss: 5.0075 - val_accuracy: 0.2952
Epoch 118/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9456 - accuracy: 0.2981 - val_loss: 5.0014 - val_accuracy: 0.2992
Epoch 119/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9316 - accuracy: 0.2946 - val_loss: 4.9864 - val_accuracy: 0.3052
Epoch 120/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9284 - accuracy: 0.2974 - val_loss: 4.9739 - val_accuracy: 0.3078
Epoch 121/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9121 - accuracy: 0.2967 - val_loss: 4.9833 - val_accuracy: 0.3019
Epoch 122/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9186 - accuracy: 0.2923 - val_loss: 4.9684 - val_accuracy: 0.3032
Epoch 123/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9153 - accuracy: 0.2983 - val_loss: 4.9672 - val_accuracy: 0.3039
Epoch 124/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9221 - accuracy: 0.2933 - val_loss: 4.9614 - val_accuracy: 0.3032
Epoch 125/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8900 - accuracy: 0.3049 - val_loss: 4.9385 - val_accuracy: 0.3072
Epoch 126/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8934 - accuracy: 0.3026 - val_loss: 4.9614 - val_accuracy: 0.3025
Epoch 127/200
266/266 [==============================] - 1s 5ms/step - loss: 4.9040 - accuracy: 0.2960 - val_loss: 4.9470 - val_accuracy: 0.3025
Epoch 128/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8722 - accuracy: 0.3155 - val_loss: 4.9318 - val_accuracy: 0.3112
Epoch 129/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8780 - accuracy: 0.3020 - val_loss: 4.9348 - val_accuracy: 0.3059
Epoch 130/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8826 - accuracy: 0.3117 - val_loss: 4.9319 - val_accuracy: 0.3059
Epoch 131/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8841 - accuracy: 0.3013 - val_loss: 4.9303 - val_accuracy: 0.3072
Epoch 132/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8384 - accuracy: 0.2999 - val_loss: 4.9046 - val_accuracy: 0.3105
Epoch 133/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8357 - accuracy: 0.3192 - val_loss: 4.9015 - val_accuracy: 0.3118
Epoch 134/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8310 - accuracy: 0.3144 - val_loss: 4.9030 - val_accuracy: 0.3125
Epoch 135/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8497 - accuracy: 0.3114 - val_loss: 4.9017 - val_accuracy: 0.3092
Epoch 136/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8505 - accuracy: 0.3042 - val_loss: 4.8891 - val_accuracy: 0.3125
Epoch 137/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8263 - accuracy: 0.3137 - val_loss: 4.8851 - val_accuracy: 0.3125
Epoch 138/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8209 - accuracy: 0.3198 - val_loss: 4.8797 - val_accuracy: 0.3125
Epoch 139/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8039 - accuracy: 0.3161 - val_loss: 4.8784 - val_accuracy: 0.3118
Epoch 140/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8078 - accuracy: 0.3057 - val_loss: 4.8612 - val_accuracy: 0.3158
Epoch 141/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8130 - accuracy: 0.3132 - val_loss: 4.8672 - val_accuracy: 0.3158
Epoch 142/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8071 - accuracy: 0.3147 - val_loss: 4.8628 - val_accuracy: 0.3158
Epoch 143/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7966 - accuracy: 0.3223 - val_loss: 4.8543 - val_accuracy: 0.3152
Epoch 144/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7926 - accuracy: 0.3195 - val_loss: 4.8423 - val_accuracy: 0.3165
Epoch 145/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7506 - accuracy: 0.3285 - val_loss: 4.8381 - val_accuracy: 0.3205
Epoch 146/200
266/266 [==============================] - 1s 5ms/step - loss: 4.8007 - accuracy: 0.3169 - val_loss: 4.8282 - val_accuracy: 0.3218
Epoch 147/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7801 - accuracy: 0.3160 - val_loss: 4.8564 - val_accuracy: 0.3085
Epoch 148/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7461 - accuracy: 0.3258 - val_loss: 4.8382 - val_accuracy: 0.3158
Epoch 149/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7531 - accuracy: 0.3299 - val_loss: 4.8200 - val_accuracy: 0.3198
Epoch 150/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7338 - accuracy: 0.3309 - val_loss: 4.8217 - val_accuracy: 0.3165
Epoch 151/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7411 - accuracy: 0.3233 - val_loss: 4.7976 - val_accuracy: 0.3238
Epoch 152/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7240 - accuracy: 0.3289 - val_loss: 4.7985 - val_accuracy: 0.3251
Epoch 153/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7264 - accuracy: 0.3311 - val_loss: 4.7987 - val_accuracy: 0.3231
Epoch 154/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7221 - accuracy: 0.3348 - val_loss: 4.8008 - val_accuracy: 0.3185
Epoch 155/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7102 - accuracy: 0.3278 - val_loss: 4.7893 - val_accuracy: 0.3245
Epoch 156/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7207 - accuracy: 0.3375 - val_loss: 4.7885 - val_accuracy: 0.3211
Epoch 157/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7088 - accuracy: 0.3349 - val_loss: 4.7936 - val_accuracy: 0.3185
Epoch 158/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7087 - accuracy: 0.3315 - val_loss: 4.7577 - val_accuracy: 0.3331
Epoch 159/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6882 - accuracy: 0.3341 - val_loss: 4.7662 - val_accuracy: 0.3258
Epoch 160/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7089 - accuracy: 0.3381 - val_loss: 4.7624 - val_accuracy: 0.3211
Epoch 161/200
266/266 [==============================] - 1s 5ms/step - loss: 4.7022 - accuracy: 0.3422 - val_loss: 4.7525 - val_accuracy: 0.3271
Epoch 162/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6786 - accuracy: 0.3426 - val_loss: 4.7448 - val_accuracy: 0.3291
Epoch 163/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6916 - accuracy: 0.3245 - val_loss: 4.7457 - val_accuracy: 0.3311
Epoch 164/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6772 - accuracy: 0.3507 - val_loss: 4.7423 - val_accuracy: 0.3324
Epoch 165/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6735 - accuracy: 0.3473 - val_loss: 4.7405 - val_accuracy: 0.3311
Epoch 166/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6457 - accuracy: 0.3423 - val_loss: 4.7189 - val_accuracy: 0.3371
Epoch 167/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6477 - accuracy: 0.3426 - val_loss: 4.7218 - val_accuracy: 0.3371
Epoch 168/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6279 - accuracy: 0.3462 - val_loss: 4.7239 - val_accuracy: 0.3331
Epoch 169/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6251 - accuracy: 0.3487 - val_loss: 4.7175 - val_accuracy: 0.3358
Epoch 170/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6211 - accuracy: 0.3540 - val_loss: 4.7155 - val_accuracy: 0.3364
Epoch 171/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6210 - accuracy: 0.3494 - val_loss: 4.7210 - val_accuracy: 0.3338
Epoch 172/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6265 - accuracy: 0.3419 - val_loss: 4.7056 - val_accuracy: 0.3384
Epoch 173/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6033 - accuracy: 0.3435 - val_loss: 4.6917 - val_accuracy: 0.3384
Epoch 174/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6116 - accuracy: 0.3500 - val_loss: 4.6815 - val_accuracy: 0.3451
Epoch 175/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6013 - accuracy: 0.3519 - val_loss: 4.6886 - val_accuracy: 0.3438
Epoch 176/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6167 - accuracy: 0.3327 - val_loss: 4.6847 - val_accuracy: 0.3424
Epoch 177/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5946 - accuracy: 0.3553 - val_loss: 4.6879 - val_accuracy: 0.3444
Epoch 178/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5860 - accuracy: 0.3570 - val_loss: 4.6643 - val_accuracy: 0.3438
Epoch 179/200
266/266 [==============================] - 1s 5ms/step - loss: 4.6122 - accuracy: 0.3430 - val_loss: 4.6517 - val_accuracy: 0.3457
Epoch 180/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5771 - accuracy: 0.3534 - val_loss: 4.6736 - val_accuracy: 0.3484
Epoch 181/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5630 - accuracy: 0.3534 - val_loss: 4.6735 - val_accuracy: 0.3444
Epoch 182/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5799 - accuracy: 0.3538 - val_loss: 4.6483 - val_accuracy: 0.3504
Epoch 183/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5798 - accuracy: 0.3489 - val_loss: 4.6268 - val_accuracy: 0.3557
Epoch 184/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5625 - accuracy: 0.3486 - val_loss: 4.6282 - val_accuracy: 0.3531
Epoch 185/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5510 - accuracy: 0.3593 - val_loss: 4.6446 - val_accuracy: 0.3504
Epoch 186/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5747 - accuracy: 0.3497 - val_loss: 4.6474 - val_accuracy: 0.3484
Epoch 187/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5576 - accuracy: 0.3512 - val_loss: 4.6239 - val_accuracy: 0.3504
Epoch 188/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5408 - accuracy: 0.3545 - val_loss: 4.6446 - val_accuracy: 0.3471
Epoch 189/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5158 - accuracy: 0.3663 - val_loss: 4.6142 - val_accuracy: 0.3537
Epoch 190/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5589 - accuracy: 0.3565 - val_loss: 4.5915 - val_accuracy: 0.3610
Epoch 191/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5685 - accuracy: 0.3423 - val_loss: 4.6220 - val_accuracy: 0.3511
Epoch 192/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5252 - accuracy: 0.3645 - val_loss: 4.6177 - val_accuracy: 0.3524
Epoch 193/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5281 - accuracy: 0.3623 - val_loss: 4.5909 - val_accuracy: 0.3597
Epoch 194/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5260 - accuracy: 0.3640 - val_loss: 4.5939 - val_accuracy: 0.3570
Epoch 195/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5082 - accuracy: 0.3716 - val_loss: 4.6032 - val_accuracy: 0.3557
Epoch 196/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5072 - accuracy: 0.3621 - val_loss: 4.5888 - val_accuracy: 0.3624
Epoch 197/200
266/266 [==============================] - 1s 5ms/step - loss: 4.5118 - accuracy: 0.3574 - val_loss: 4.5914 - val_accuracy: 0.3531
Epoch 198/200
266/266 [==============================] - 1s 5ms/step - loss: 4.4944 - accuracy: 0.3666 - val_loss: 4.6155 - val_accuracy: 0.3484
Epoch 199/200
266/266 [==============================] - 1s 5ms/step - loss: 4.4695 - accuracy: 0.3619 - val_loss: 4.5682 - val_accuracy: 0.3637
Epoch 200/200
266/266 [==============================] - 1s 5ms/step - loss: 4.4748 - accuracy: 0.3719 - val_loss: 4.5729 - val_accuracy: 0.3557
In [ ]:
# evaluate the SGD-optimized CNN2 on the test set and record its accuracy
_, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
accuracies_opt_SGD["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     4.574
Accuracy: 34.871%

Transfer learning

VGG16
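The architecture reported in the model summary below (VGG16 backbone, Dropout, GlobalAveragePooling2D, Dense softmax head, all 14,724,948 parameters trainable) can be sketched as follows. This is a hypothetical reconstruction of `init_VGG16_model_optimized`, not the notebook's exact definition; the dropout rate and the `weights` parameter added here for illustration are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

def init_VGG16_model_optimized(trainable, optimizer=tf.optimizers.SGD,
                               weights="imagenet", n_classes=20):
    # Hypothetical reconstruction matching the printed summary:
    # VGG16 base (no top) -> Dropout -> GlobalAveragePooling2D -> Dense softmax.
    base = VGG16(weights=weights, include_top=False, input_shape=(32, 32, 3))
    base.trainable = trainable  # True above: the whole backbone is fine-tuned
    model = models.Sequential([
        base,
        layers.Dropout(0.4),  # assumed rate; the summary does not record it
        layers.GlobalAveragePooling2D(),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer=optimizer(),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    return model
```

With 32x32 CIFAR inputs the VGG16 feature map collapses to 1x1x512 after the five pooling stages, which is why the summary shows `(None, 1, 1, 512)` before the pooling layer.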
In [ ]:
# fine-tune VGG16 (all layers trainable) with SGD on the 20-class subset
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True, optimizer = tf.optimizers.SGD)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs = 200, callbacks = [callback])
Model: "sequential_23"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_38 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_14  (None, 512)               0         
_________________________________________________________________
dense_33 (Dense)             (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 8s 29ms/step - loss: 3.4273 - accuracy: 0.0583 - val_loss: 2.8791 - val_accuracy: 0.1376
Epoch 2/200
266/266 [==============================] - 8s 29ms/step - loss: 2.9929 - accuracy: 0.1078 - val_loss: 2.6428 - val_accuracy: 0.2320
Epoch 3/200
266/266 [==============================] - 8s 29ms/step - loss: 2.7735 - accuracy: 0.1550 - val_loss: 2.4156 - val_accuracy: 0.3484
Epoch 4/200
266/266 [==============================] - 8s 29ms/step - loss: 2.5554 - accuracy: 0.2208 - val_loss: 2.2049 - val_accuracy: 0.3956
Epoch 5/200
266/266 [==============================] - 8s 29ms/step - loss: 2.3734 - accuracy: 0.2798 - val_loss: 2.0303 - val_accuracy: 0.4355
Epoch 6/200
266/266 [==============================] - 8s 29ms/step - loss: 2.2223 - accuracy: 0.3179 - val_loss: 1.8979 - val_accuracy: 0.4561
Epoch 7/200
266/266 [==============================] - 8s 29ms/step - loss: 2.0989 - accuracy: 0.3592 - val_loss: 1.7755 - val_accuracy: 0.4914
Epoch 8/200
266/266 [==============================] - 8s 29ms/step - loss: 2.0080 - accuracy: 0.3872 - val_loss: 1.6923 - val_accuracy: 0.5133
Epoch 9/200
266/266 [==============================] - 8s 29ms/step - loss: 1.9135 - accuracy: 0.4128 - val_loss: 1.6155 - val_accuracy: 0.5366
Epoch 10/200
266/266 [==============================] - 8s 29ms/step - loss: 1.8353 - accuracy: 0.4349 - val_loss: 1.5450 - val_accuracy: 0.5519
Epoch 11/200
266/266 [==============================] - 8s 29ms/step - loss: 1.7600 - accuracy: 0.4622 - val_loss: 1.5136 - val_accuracy: 0.5539
Epoch 12/200
266/266 [==============================] - 8s 29ms/step - loss: 1.7039 - accuracy: 0.4853 - val_loss: 1.4572 - val_accuracy: 0.5878
Epoch 13/200
266/266 [==============================] - 8s 29ms/step - loss: 1.6443 - accuracy: 0.4993 - val_loss: 1.4001 - val_accuracy: 0.5911
Epoch 14/200
266/266 [==============================] - 8s 29ms/step - loss: 1.6069 - accuracy: 0.5157 - val_loss: 1.3719 - val_accuracy: 0.6044
Epoch 15/200
266/266 [==============================] - 8s 29ms/step - loss: 1.5466 - accuracy: 0.5285 - val_loss: 1.3407 - val_accuracy: 0.6031
Epoch 16/200
266/266 [==============================] - 8s 29ms/step - loss: 1.5091 - accuracy: 0.5406 - val_loss: 1.3213 - val_accuracy: 0.6064
Epoch 17/200
266/266 [==============================] - 8s 29ms/step - loss: 1.4661 - accuracy: 0.5518 - val_loss: 1.3007 - val_accuracy: 0.6097
Epoch 18/200
266/266 [==============================] - 8s 29ms/step - loss: 1.4388 - accuracy: 0.5630 - val_loss: 1.2725 - val_accuracy: 0.6243
Epoch 19/200
266/266 [==============================] - 8s 29ms/step - loss: 1.4047 - accuracy: 0.5780 - val_loss: 1.2396 - val_accuracy: 0.6356
Epoch 20/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3876 - accuracy: 0.5824 - val_loss: 1.2252 - val_accuracy: 0.6403
Epoch 21/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3581 - accuracy: 0.5920 - val_loss: 1.2024 - val_accuracy: 0.6376
Epoch 22/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3088 - accuracy: 0.5905 - val_loss: 1.1740 - val_accuracy: 0.6543
Epoch 23/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2886 - accuracy: 0.6112 - val_loss: 1.1803 - val_accuracy: 0.6496
Epoch 24/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2662 - accuracy: 0.6138 - val_loss: 1.1550 - val_accuracy: 0.6622
Epoch 25/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2613 - accuracy: 0.6182 - val_loss: 1.1311 - val_accuracy: 0.6662
Epoch 26/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2008 - accuracy: 0.6298 - val_loss: 1.1496 - val_accuracy: 0.6616
Epoch 27/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2175 - accuracy: 0.6273 - val_loss: 1.1116 - val_accuracy: 0.6649
Epoch 28/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1787 - accuracy: 0.6407 - val_loss: 1.1015 - val_accuracy: 0.6649
Epoch 29/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1646 - accuracy: 0.6387 - val_loss: 1.0769 - val_accuracy: 0.6795
Epoch 30/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1730 - accuracy: 0.6494 - val_loss: 1.0782 - val_accuracy: 0.6762
Epoch 31/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1437 - accuracy: 0.6482 - val_loss: 1.0684 - val_accuracy: 0.6735
Epoch 32/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1215 - accuracy: 0.6512 - val_loss: 1.0614 - val_accuracy: 0.6749
Epoch 33/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0678 - accuracy: 0.6748 - val_loss: 1.0527 - val_accuracy: 0.6842
Epoch 34/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0624 - accuracy: 0.6711 - val_loss: 1.0476 - val_accuracy: 0.6862
Epoch 35/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0636 - accuracy: 0.6723 - val_loss: 1.0353 - val_accuracy: 0.6895
Epoch 36/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0788 - accuracy: 0.6650 - val_loss: 1.0313 - val_accuracy: 0.6915
Epoch 37/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0291 - accuracy: 0.6919 - val_loss: 1.0260 - val_accuracy: 0.6868
Epoch 38/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0025 - accuracy: 0.6985 - val_loss: 1.0052 - val_accuracy: 0.6995
Epoch 39/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9885 - accuracy: 0.6977 - val_loss: 1.0069 - val_accuracy: 0.6915
Epoch 40/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9941 - accuracy: 0.6926 - val_loss: 1.0075 - val_accuracy: 0.6915
Epoch 41/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9757 - accuracy: 0.6930 - val_loss: 0.9956 - val_accuracy: 0.6968
Epoch 42/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9685 - accuracy: 0.6987 - val_loss: 0.9833 - val_accuracy: 0.7035
Epoch 43/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9542 - accuracy: 0.7037 - val_loss: 0.9895 - val_accuracy: 0.7015
Epoch 44/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9513 - accuracy: 0.7077 - val_loss: 0.9902 - val_accuracy: 0.7008
Epoch 45/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9560 - accuracy: 0.7033 - val_loss: 0.9833 - val_accuracy: 0.6948
Epoch 46/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9288 - accuracy: 0.7188 - val_loss: 0.9755 - val_accuracy: 0.6981
Epoch 47/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8979 - accuracy: 0.7228 - val_loss: 0.9604 - val_accuracy: 0.7088
Epoch 48/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9074 - accuracy: 0.7151 - val_loss: 0.9578 - val_accuracy: 0.7088
Epoch 49/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9016 - accuracy: 0.7217 - val_loss: 0.9589 - val_accuracy: 0.7041
Epoch 50/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9065 - accuracy: 0.7170 - val_loss: 0.9579 - val_accuracy: 0.7101
Epoch 51/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8650 - accuracy: 0.7337 - val_loss: 0.9583 - val_accuracy: 0.7061
Epoch 52/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8624 - accuracy: 0.7330 - val_loss: 0.9524 - val_accuracy: 0.7108
Epoch 53/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8432 - accuracy: 0.7391 - val_loss: 0.9529 - val_accuracy: 0.7114
Epoch 54/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8309 - accuracy: 0.7403 - val_loss: 0.9464 - val_accuracy: 0.7055
Epoch 55/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8338 - accuracy: 0.7445 - val_loss: 0.9319 - val_accuracy: 0.7181
Epoch 56/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8491 - accuracy: 0.7272 - val_loss: 0.9395 - val_accuracy: 0.7088
Epoch 57/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8171 - accuracy: 0.7399 - val_loss: 0.9346 - val_accuracy: 0.7108
Epoch 58/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7903 - accuracy: 0.7514 - val_loss: 0.9311 - val_accuracy: 0.7154
Epoch 59/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8121 - accuracy: 0.7434 - val_loss: 0.9281 - val_accuracy: 0.7194
Epoch 60/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8018 - accuracy: 0.7550 - val_loss: 0.9272 - val_accuracy: 0.7141
Epoch 61/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7740 - accuracy: 0.7621 - val_loss: 0.9204 - val_accuracy: 0.7134
Epoch 62/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7801 - accuracy: 0.7564 - val_loss: 0.9078 - val_accuracy: 0.7174
Epoch 63/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7805 - accuracy: 0.7601 - val_loss: 0.9227 - val_accuracy: 0.7214
Epoch 64/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7613 - accuracy: 0.7619 - val_loss: 0.9029 - val_accuracy: 0.7148
Epoch 65/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7563 - accuracy: 0.7629 - val_loss: 0.9060 - val_accuracy: 0.7294
Epoch 66/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7425 - accuracy: 0.7701 - val_loss: 0.9238 - val_accuracy: 0.7194
Epoch 67/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7319 - accuracy: 0.7748 - val_loss: 0.9013 - val_accuracy: 0.7234
Epoch 68/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7248 - accuracy: 0.7728 - val_loss: 0.8980 - val_accuracy: 0.7314
Epoch 69/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7183 - accuracy: 0.7793 - val_loss: 0.9039 - val_accuracy: 0.7207
Epoch 70/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7211 - accuracy: 0.7768 - val_loss: 0.8882 - val_accuracy: 0.7320
Epoch 71/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7037 - accuracy: 0.7833 - val_loss: 0.8973 - val_accuracy: 0.7241
Epoch 72/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7053 - accuracy: 0.7802 - val_loss: 0.8942 - val_accuracy: 0.7281
Epoch 73/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7072 - accuracy: 0.7773 - val_loss: 0.8890 - val_accuracy: 0.7334
Epoch 74/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6799 - accuracy: 0.7869 - val_loss: 0.9245 - val_accuracy: 0.7188
Epoch 75/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6933 - accuracy: 0.7799 - val_loss: 0.9036 - val_accuracy: 0.7281
Epoch 76/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6856 - accuracy: 0.7874 - val_loss: 0.8906 - val_accuracy: 0.7254
Epoch 77/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6631 - accuracy: 0.7912 - val_loss: 0.8992 - val_accuracy: 0.7354
Epoch 78/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6684 - accuracy: 0.7954 - val_loss: 0.9013 - val_accuracy: 0.7314
Epoch 79/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6418 - accuracy: 0.7980 - val_loss: 0.8988 - val_accuracy: 0.7374
Epoch 80/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6462 - accuracy: 0.7999 - val_loss: 0.8885 - val_accuracy: 0.7307
Epoch 81/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6565 - accuracy: 0.7958 - val_loss: 0.8761 - val_accuracy: 0.7360
Epoch 82/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6502 - accuracy: 0.7952 - val_loss: 0.8799 - val_accuracy: 0.7380
Epoch 83/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6327 - accuracy: 0.8020 - val_loss: 0.8905 - val_accuracy: 0.7374
Epoch 84/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6284 - accuracy: 0.8051 - val_loss: 0.8905 - val_accuracy: 0.7400
Epoch 85/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6093 - accuracy: 0.8086 - val_loss: 0.8743 - val_accuracy: 0.7320
Epoch 86/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6266 - accuracy: 0.8007 - val_loss: 0.8819 - val_accuracy: 0.7360
Epoch 87/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6045 - accuracy: 0.8097 - val_loss: 0.8913 - val_accuracy: 0.7340
Epoch 88/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5783 - accuracy: 0.8138 - val_loss: 0.8732 - val_accuracy: 0.7334
Epoch 89/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5874 - accuracy: 0.8198 - val_loss: 0.8788 - val_accuracy: 0.7334
Epoch 90/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5654 - accuracy: 0.8229 - val_loss: 0.8664 - val_accuracy: 0.7334
Epoch 91/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5798 - accuracy: 0.8136 - val_loss: 0.8806 - val_accuracy: 0.7427
Epoch 92/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5617 - accuracy: 0.8224 - val_loss: 0.8671 - val_accuracy: 0.7387
Epoch 93/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5719 - accuracy: 0.8225 - val_loss: 0.8703 - val_accuracy: 0.7427
Epoch 94/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5670 - accuracy: 0.8186 - val_loss: 0.8929 - val_accuracy: 0.7394
Epoch 95/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5555 - accuracy: 0.8273 - val_loss: 0.8845 - val_accuracy: 0.7387
Epoch 96/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5654 - accuracy: 0.8281 - val_loss: 0.8829 - val_accuracy: 0.7420
Epoch 97/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5370 - accuracy: 0.8295 - val_loss: 0.8891 - val_accuracy: 0.7367
Epoch 98/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5529 - accuracy: 0.8254 - val_loss: 0.8594 - val_accuracy: 0.7427
Epoch 99/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5502 - accuracy: 0.8187 - val_loss: 0.8638 - val_accuracy: 0.7414
Epoch 100/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5333 - accuracy: 0.8317 - val_loss: 0.8935 - val_accuracy: 0.7334
Epoch 101/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5326 - accuracy: 0.8308 - val_loss: 0.8938 - val_accuracy: 0.7434
Epoch 102/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5280 - accuracy: 0.8327 - val_loss: 0.8648 - val_accuracy: 0.7440
Epoch 103/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5131 - accuracy: 0.8351 - val_loss: 0.8642 - val_accuracy: 0.7427
Epoch 104/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5252 - accuracy: 0.8363 - val_loss: 0.8800 - val_accuracy: 0.7407
Epoch 105/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5125 - accuracy: 0.8334 - val_loss: 0.8708 - val_accuracy: 0.7440
Epoch 106/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5050 - accuracy: 0.8437 - val_loss: 0.8816 - val_accuracy: 0.7414
Epoch 107/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5195 - accuracy: 0.8381 - val_loss: 0.8704 - val_accuracy: 0.7493
Epoch 108/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4984 - accuracy: 0.8461 - val_loss: 0.8672 - val_accuracy: 0.7447
Epoch 109/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4783 - accuracy: 0.8494 - val_loss: 0.8685 - val_accuracy: 0.7467
Epoch 110/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5101 - accuracy: 0.8440 - val_loss: 0.8844 - val_accuracy: 0.7440
Epoch 111/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4852 - accuracy: 0.8461 - val_loss: 0.8852 - val_accuracy: 0.7460
Epoch 112/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4870 - accuracy: 0.8415 - val_loss: 0.9004 - val_accuracy: 0.7460
Epoch 113/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4732 - accuracy: 0.8475 - val_loss: 0.8709 - val_accuracy: 0.7467
Epoch 114/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4934 - accuracy: 0.8416 - val_loss: 0.8838 - val_accuracy: 0.7347
Epoch 115/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4541 - accuracy: 0.8526 - val_loss: 0.8651 - val_accuracy: 0.7493
Epoch 116/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4645 - accuracy: 0.8518 - val_loss: 0.8561 - val_accuracy: 0.7434
Epoch 117/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4679 - accuracy: 0.8484 - val_loss: 0.8830 - val_accuracy: 0.7387
Epoch 118/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4304 - accuracy: 0.8695 - val_loss: 0.8689 - val_accuracy: 0.7487
Epoch 119/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4728 - accuracy: 0.8508 - val_loss: 0.8739 - val_accuracy: 0.7453
Epoch 120/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4419 - accuracy: 0.8609 - val_loss: 0.8703 - val_accuracy: 0.7440
Epoch 121/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4210 - accuracy: 0.8657 - val_loss: 0.8880 - val_accuracy: 0.7507
Epoch 122/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4325 - accuracy: 0.8668 - val_loss: 0.8790 - val_accuracy: 0.7453
Epoch 123/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4261 - accuracy: 0.8601 - val_loss: 0.8785 - val_accuracy: 0.7440
Epoch 124/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4415 - accuracy: 0.8585 - val_loss: 0.8834 - val_accuracy: 0.7434
Epoch 125/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4233 - accuracy: 0.8642 - val_loss: 0.8847 - val_accuracy: 0.7467
Epoch 126/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4027 - accuracy: 0.8726 - val_loss: 0.8765 - val_accuracy: 0.7480
Epoch 127/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4184 - accuracy: 0.8659 - val_loss: 0.9108 - val_accuracy: 0.7354
Epoch 128/200
266/266 [==============================] - 8s 29ms/step - loss: 0.3991 - accuracy: 0.8726 - val_loss: 0.8739 - val_accuracy: 0.7527
Epoch 129/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4008 - accuracy: 0.8717 - val_loss: 0.8845 - val_accuracy: 0.7493
Epoch 130/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4091 - accuracy: 0.8703 - val_loss: 0.8864 - val_accuracy: 0.7473
Epoch 131/200
266/266 [==============================] - 8s 29ms/step - loss: 0.3935 - accuracy: 0.8752 - val_loss: 0.8764 - val_accuracy: 0.7573
Epoch 132/200
266/266 [==============================] - 8s 29ms/step - loss: 0.3849 - accuracy: 0.8695 - val_loss: 0.8872 - val_accuracy: 0.7560
Epoch 133/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4009 - accuracy: 0.8732 - val_loss: 0.8965 - val_accuracy: 0.7447
Epoch 134/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4027 - accuracy: 0.8675 - val_loss: 0.8877 - val_accuracy: 0.7500
Epoch 135/200
266/266 [==============================] - 8s 29ms/step - loss: 0.3789 - accuracy: 0.8898 - val_loss: 0.8830 - val_accuracy: 0.7513
Epoch 136/200
266/266 [==============================] - 8s 29ms/step - loss: 0.3827 - accuracy: 0.8799 - val_loss: 0.8911 - val_accuracy: 0.7500
In [ ]:
_, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
accuracies_opt_SGD["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.840
Accuracy: 75.248%
MobileNetV2
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True, optimizer = tf.optimizers.SGD)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset = train_ds_res, validation_dataset = validation_ds_res, epochs = 200, callbacks=[callback])
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_2 (Dropout)          (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_2 ( (None, 1280)              0         
_________________________________________________________________
dense_2 (Dense)              (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
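Based on the summary above, a minimal sketch of what `init_MobileNetV2_model_optimized` presumably builds: a fully fine-tuned MobileNetV2 backbone followed by dropout, global average pooling, and a 20-way softmax head. The helper itself is defined earlier in the notebook; the dropout rate, the loss, and `weights=None` (used here to avoid a download, where the notebook presumably starts from `'imagenet'` weights) are assumptions:

```python
import tensorflow as tf

# Hedged reconstruction of init_MobileNetV2_model_optimized, inferred
# from the model summary; not the notebook's actual definition.
def init_MobileNetV2_model_optimized(trainable, optimizer=tf.optimizers.SGD):
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights=None)
    base.trainable = trainable  # fine-tune the whole backbone

    model = tf.keras.models.Sequential([
        base,                                      # (None, 7, 7, 1280)
        tf.keras.layers.Dropout(0.2),              # rate is an assumption
        tf.keras.layers.GlobalAveragePooling2D(),  # (None, 1280)
        tf.keras.layers.Dense(20, activation='softmax'),  # 20 classes
    ])
    model.compile(optimizer=optimizer(),
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model
```

The layer order (dropout applied before pooling) and the parameter counts match the summary printed above.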
Epoch 1/200
266/266 [==============================] - 69s 222ms/step - loss: 3.3084 - accuracy: 0.0496 - val_loss: 3.3908 - val_accuracy: 0.0439
Epoch 2/200
266/266 [==============================] - 59s 222ms/step - loss: 3.0743 - accuracy: 0.0825 - val_loss: 3.2276 - val_accuracy: 0.0492
Epoch 3/200
266/266 [==============================] - 59s 223ms/step - loss: 2.8464 - accuracy: 0.1413 - val_loss: 3.1345 - val_accuracy: 0.0572
Epoch 4/200
266/266 [==============================] - 59s 223ms/step - loss: 2.6819 - accuracy: 0.2011 - val_loss: 3.1003 - val_accuracy: 0.0665
Epoch 5/200
266/266 [==============================] - 60s 227ms/step - loss: 2.5174 - accuracy: 0.2653 - val_loss: 3.0899 - val_accuracy: 0.0798
Epoch 6/200
266/266 [==============================] - 61s 229ms/step - loss: 2.3556 - accuracy: 0.3252 - val_loss: 3.0648 - val_accuracy: 0.0924
Epoch 7/200
266/266 [==============================] - 61s 229ms/step - loss: 2.2258 - accuracy: 0.3826 - val_loss: 2.9843 - val_accuracy: 0.1137
Epoch 8/200
266/266 [==============================] - 61s 228ms/step - loss: 2.0770 - accuracy: 0.4408 - val_loss: 2.8442 - val_accuracy: 0.1722
Epoch 9/200
266/266 [==============================] - 61s 229ms/step - loss: 1.9570 - accuracy: 0.4918 - val_loss: 2.6991 - val_accuracy: 0.2108
Epoch 10/200
266/266 [==============================] - 61s 229ms/step - loss: 1.8035 - accuracy: 0.5428 - val_loss: 2.4493 - val_accuracy: 0.2859
Epoch 11/200
266/266 [==============================] - 61s 228ms/step - loss: 1.7479 - accuracy: 0.5457 - val_loss: 2.2160 - val_accuracy: 0.3690
Epoch 12/200
266/266 [==============================] - 60s 225ms/step - loss: 1.6461 - accuracy: 0.5858 - val_loss: 1.9563 - val_accuracy: 0.4628
Epoch 13/200
266/266 [==============================] - 61s 229ms/step - loss: 1.5537 - accuracy: 0.6061 - val_loss: 1.7974 - val_accuracy: 0.5133
Epoch 14/200
266/266 [==============================] - 61s 230ms/step - loss: 1.4437 - accuracy: 0.6441 - val_loss: 1.6531 - val_accuracy: 0.5545
Epoch 15/200
266/266 [==============================] - 61s 229ms/step - loss: 1.3911 - accuracy: 0.6448 - val_loss: 1.5697 - val_accuracy: 0.5698
Epoch 16/200
266/266 [==============================] - 61s 229ms/step - loss: 1.3084 - accuracy: 0.6823 - val_loss: 1.4789 - val_accuracy: 0.6037
Epoch 17/200
266/266 [==============================] - 61s 228ms/step - loss: 1.2694 - accuracy: 0.6788 - val_loss: 1.3359 - val_accuracy: 0.6529
Epoch 18/200
266/266 [==============================] - 61s 228ms/step - loss: 1.2174 - accuracy: 0.6913 - val_loss: 1.2604 - val_accuracy: 0.6742
Epoch 19/200
266/266 [==============================] - 61s 229ms/step - loss: 1.1402 - accuracy: 0.7169 - val_loss: 1.1855 - val_accuracy: 0.6961
Epoch 20/200
266/266 [==============================] - 61s 229ms/step - loss: 1.1137 - accuracy: 0.7149 - val_loss: 1.1472 - val_accuracy: 0.7015
Epoch 21/200
266/266 [==============================] - 61s 229ms/step - loss: 1.0760 - accuracy: 0.7205 - val_loss: 1.0823 - val_accuracy: 0.7241
Epoch 22/200
266/266 [==============================] - 61s 228ms/step - loss: 1.0102 - accuracy: 0.7391 - val_loss: 1.0546 - val_accuracy: 0.7168
Epoch 23/200
266/266 [==============================] - 61s 228ms/step - loss: 0.9694 - accuracy: 0.7484 - val_loss: 0.9808 - val_accuracy: 0.7440
Epoch 24/200
266/266 [==============================] - 61s 228ms/step - loss: 0.9600 - accuracy: 0.7567 - val_loss: 0.9912 - val_accuracy: 0.7394
Epoch 25/200
266/266 [==============================] - 61s 228ms/step - loss: 0.9321 - accuracy: 0.7542 - val_loss: 0.9014 - val_accuracy: 0.7606
Epoch 26/200
266/266 [==============================] - 60s 227ms/step - loss: 0.9048 - accuracy: 0.7569 - val_loss: 0.9182 - val_accuracy: 0.7527
Epoch 27/200
266/266 [==============================] - 60s 225ms/step - loss: 0.8637 - accuracy: 0.7703 - val_loss: 0.8615 - val_accuracy: 0.7726
Epoch 28/200
266/266 [==============================] - 61s 228ms/step - loss: 0.8365 - accuracy: 0.7779 - val_loss: 0.8585 - val_accuracy: 0.7699
Epoch 29/200
266/266 [==============================] - 61s 229ms/step - loss: 0.8330 - accuracy: 0.7712 - val_loss: 0.8738 - val_accuracy: 0.7620
Epoch 30/200
266/266 [==============================] - 61s 228ms/step - loss: 0.7909 - accuracy: 0.7893 - val_loss: 0.8178 - val_accuracy: 0.7819
Epoch 31/200
266/266 [==============================] - 59s 222ms/step - loss: 0.7876 - accuracy: 0.7845 - val_loss: 0.8224 - val_accuracy: 0.7653
Epoch 32/200
266/266 [==============================] - 61s 228ms/step - loss: 0.7456 - accuracy: 0.7976 - val_loss: 0.7465 - val_accuracy: 0.7965
Epoch 33/200
266/266 [==============================] - 60s 227ms/step - loss: 0.7509 - accuracy: 0.8029 - val_loss: 0.7330 - val_accuracy: 0.7965
Epoch 34/200
266/266 [==============================] - 61s 229ms/step - loss: 0.7415 - accuracy: 0.7933 - val_loss: 0.7316 - val_accuracy: 0.7959
Epoch 35/200
266/266 [==============================] - 61s 228ms/step - loss: 0.7172 - accuracy: 0.8012 - val_loss: 0.7174 - val_accuracy: 0.8092
Epoch 36/200
266/266 [==============================] - 61s 228ms/step - loss: 0.6707 - accuracy: 0.8211 - val_loss: 0.7239 - val_accuracy: 0.7926
Epoch 37/200
266/266 [==============================] - 61s 228ms/step - loss: 0.6684 - accuracy: 0.8152 - val_loss: 0.6891 - val_accuracy: 0.8019
Epoch 38/200
266/266 [==============================] - 61s 228ms/step - loss: 0.6651 - accuracy: 0.8203 - val_loss: 0.6753 - val_accuracy: 0.8112
Epoch 39/200
266/266 [==============================] - 61s 228ms/step - loss: 0.6680 - accuracy: 0.8164 - val_loss: 0.6674 - val_accuracy: 0.8125
Epoch 40/200
266/266 [==============================] - 60s 227ms/step - loss: 0.6272 - accuracy: 0.8252 - val_loss: 0.6473 - val_accuracy: 0.8145
Epoch 41/200
266/266 [==============================] - 61s 228ms/step - loss: 0.6077 - accuracy: 0.8314 - val_loss: 0.6377 - val_accuracy: 0.8165
Epoch 42/200
266/266 [==============================] - 61s 230ms/step - loss: 0.6440 - accuracy: 0.8188 - val_loss: 0.6391 - val_accuracy: 0.8198
Epoch 43/200
266/266 [==============================] - 60s 227ms/step - loss: 0.6198 - accuracy: 0.8232 - val_loss: 0.6269 - val_accuracy: 0.8185
Epoch 44/200
266/266 [==============================] - 61s 229ms/step - loss: 0.5971 - accuracy: 0.8313 - val_loss: 0.6237 - val_accuracy: 0.8205
Epoch 45/200
266/266 [==============================] - 61s 229ms/step - loss: 0.5806 - accuracy: 0.8406 - val_loss: 0.6059 - val_accuracy: 0.8324
Epoch 46/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5888 - accuracy: 0.8340 - val_loss: 0.6027 - val_accuracy: 0.8258
Epoch 47/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5713 - accuracy: 0.8381 - val_loss: 0.6024 - val_accuracy: 0.8238
Epoch 48/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5521 - accuracy: 0.8465 - val_loss: 0.6191 - val_accuracy: 0.8165
Epoch 49/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5275 - accuracy: 0.8541 - val_loss: 0.5959 - val_accuracy: 0.8298
Epoch 50/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5408 - accuracy: 0.8484 - val_loss: 0.5889 - val_accuracy: 0.8298
Epoch 51/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5303 - accuracy: 0.8529 - val_loss: 0.5628 - val_accuracy: 0.8418
Epoch 52/200
266/266 [==============================] - 61s 228ms/step - loss: 0.5171 - accuracy: 0.8570 - val_loss: 0.5632 - val_accuracy: 0.8444
Epoch 53/200
266/266 [==============================] - 61s 229ms/step - loss: 0.5116 - accuracy: 0.8567 - val_loss: 0.5591 - val_accuracy: 0.8324
Epoch 54/200
266/266 [==============================] - 61s 229ms/step - loss: 0.5062 - accuracy: 0.8588 - val_loss: 0.5577 - val_accuracy: 0.8404
Epoch 55/200
266/266 [==============================] - 61s 230ms/step - loss: 0.5024 - accuracy: 0.8588 - val_loss: 0.5490 - val_accuracy: 0.8438
Epoch 56/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4835 - accuracy: 0.8656 - val_loss: 0.5479 - val_accuracy: 0.8398
Epoch 57/200
266/266 [==============================] - 61s 228ms/step - loss: 0.4829 - accuracy: 0.8603 - val_loss: 0.5335 - val_accuracy: 0.8511
Epoch 58/200
266/266 [==============================] - 61s 228ms/step - loss: 0.4763 - accuracy: 0.8673 - val_loss: 0.5549 - val_accuracy: 0.8371
Epoch 59/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4698 - accuracy: 0.8664 - val_loss: 0.5352 - val_accuracy: 0.8471
Epoch 60/200
266/266 [==============================] - 61s 230ms/step - loss: 0.4830 - accuracy: 0.8621 - val_loss: 0.5318 - val_accuracy: 0.8464
Epoch 61/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4378 - accuracy: 0.8820 - val_loss: 0.5253 - val_accuracy: 0.8484
Epoch 62/200
266/266 [==============================] - 61s 230ms/step - loss: 0.4481 - accuracy: 0.8746 - val_loss: 0.5161 - val_accuracy: 0.8551
Epoch 63/200
266/266 [==============================] - 61s 230ms/step - loss: 0.4423 - accuracy: 0.8767 - val_loss: 0.5158 - val_accuracy: 0.8537
Epoch 64/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4254 - accuracy: 0.8810 - val_loss: 0.5060 - val_accuracy: 0.8544
Epoch 65/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4320 - accuracy: 0.8788 - val_loss: 0.5016 - val_accuracy: 0.8570
Epoch 66/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4265 - accuracy: 0.8789 - val_loss: 0.5382 - val_accuracy: 0.8424
Epoch 67/200
266/266 [==============================] - 61s 229ms/step - loss: 0.4241 - accuracy: 0.8802 - val_loss: 0.5011 - val_accuracy: 0.8590
Epoch 68/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3981 - accuracy: 0.8886 - val_loss: 0.5082 - val_accuracy: 0.8504
Epoch 69/200
266/266 [==============================] - 61s 228ms/step - loss: 0.4182 - accuracy: 0.8823 - val_loss: 0.5228 - val_accuracy: 0.8438
Epoch 70/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3982 - accuracy: 0.8894 - val_loss: 0.5027 - val_accuracy: 0.8524
Epoch 71/200
266/266 [==============================] - 61s 228ms/step - loss: 0.4017 - accuracy: 0.8887 - val_loss: 0.4955 - val_accuracy: 0.8557
Epoch 72/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3712 - accuracy: 0.9011 - val_loss: 0.4825 - val_accuracy: 0.8577
Epoch 73/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3761 - accuracy: 0.8975 - val_loss: 0.4790 - val_accuracy: 0.8610
Epoch 74/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3737 - accuracy: 0.8967 - val_loss: 0.4868 - val_accuracy: 0.8517
Epoch 75/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3764 - accuracy: 0.8971 - val_loss: 0.4817 - val_accuracy: 0.8604
Epoch 76/200
266/266 [==============================] - 61s 228ms/step - loss: 0.3676 - accuracy: 0.9018 - val_loss: 0.4803 - val_accuracy: 0.8584
Epoch 77/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3557 - accuracy: 0.9018 - val_loss: 0.4768 - val_accuracy: 0.8590
Epoch 78/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3584 - accuracy: 0.8993 - val_loss: 0.4764 - val_accuracy: 0.8630
Epoch 79/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3541 - accuracy: 0.9063 - val_loss: 0.4730 - val_accuracy: 0.8644
Epoch 80/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3594 - accuracy: 0.8989 - val_loss: 0.4750 - val_accuracy: 0.8590
Epoch 81/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3484 - accuracy: 0.9030 - val_loss: 0.4661 - val_accuracy: 0.8670
Epoch 82/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3566 - accuracy: 0.9011 - val_loss: 0.4820 - val_accuracy: 0.8590
Epoch 83/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3408 - accuracy: 0.9059 - val_loss: 0.4693 - val_accuracy: 0.8590
Epoch 84/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3399 - accuracy: 0.9036 - val_loss: 0.4647 - val_accuracy: 0.8604
Epoch 85/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3370 - accuracy: 0.9066 - val_loss: 0.4665 - val_accuracy: 0.8584
Epoch 86/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3174 - accuracy: 0.9169 - val_loss: 0.4531 - val_accuracy: 0.8630
Epoch 87/200
266/266 [==============================] - 60s 227ms/step - loss: 0.3132 - accuracy: 0.9138 - val_loss: 0.4589 - val_accuracy: 0.8610
Epoch 88/200
266/266 [==============================] - 61s 228ms/step - loss: 0.3151 - accuracy: 0.9140 - val_loss: 0.4504 - val_accuracy: 0.8657
Epoch 89/200
266/266 [==============================] - 61s 231ms/step - loss: 0.3215 - accuracy: 0.9132 - val_loss: 0.4513 - val_accuracy: 0.8637
Epoch 90/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3203 - accuracy: 0.9162 - val_loss: 0.4572 - val_accuracy: 0.8584
Epoch 91/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3059 - accuracy: 0.9194 - val_loss: 0.4415 - val_accuracy: 0.8703
Epoch 92/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3157 - accuracy: 0.9145 - val_loss: 0.4491 - val_accuracy: 0.8684
Epoch 93/200
266/266 [==============================] - 61s 229ms/step - loss: 0.3116 - accuracy: 0.9172 - val_loss: 0.4499 - val_accuracy: 0.8644
Epoch 94/200
266/266 [==============================] - 61s 230ms/step - loss: 0.3022 - accuracy: 0.9146 - val_loss: 0.4429 - val_accuracy: 0.8697
Epoch 95/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2907 - accuracy: 0.9225 - val_loss: 0.4495 - val_accuracy: 0.8590
Epoch 96/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2992 - accuracy: 0.9171 - val_loss: 0.4390 - val_accuracy: 0.8610
Epoch 97/200
266/266 [==============================] - 60s 225ms/step - loss: 0.2916 - accuracy: 0.9206 - val_loss: 0.4389 - val_accuracy: 0.8677
Epoch 98/200
266/266 [==============================] - 61s 231ms/step - loss: 0.2849 - accuracy: 0.9247 - val_loss: 0.4429 - val_accuracy: 0.8677
Epoch 99/200
266/266 [==============================] - 62s 231ms/step - loss: 0.2879 - accuracy: 0.9200 - val_loss: 0.4352 - val_accuracy: 0.8697
Epoch 100/200
266/266 [==============================] - 62s 231ms/step - loss: 0.2971 - accuracy: 0.9214 - val_loss: 0.4419 - val_accuracy: 0.8664
Epoch 101/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2746 - accuracy: 0.9261 - val_loss: 0.4454 - val_accuracy: 0.8677
Epoch 102/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2834 - accuracy: 0.9239 - val_loss: 0.4327 - val_accuracy: 0.8703
Epoch 103/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2724 - accuracy: 0.9269 - val_loss: 0.4333 - val_accuracy: 0.8677
Epoch 104/200
266/266 [==============================] - 60s 227ms/step - loss: 0.2702 - accuracy: 0.9267 - val_loss: 0.4336 - val_accuracy: 0.8690
Epoch 105/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2708 - accuracy: 0.9289 - val_loss: 0.4434 - val_accuracy: 0.8697
Epoch 106/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2655 - accuracy: 0.9275 - val_loss: 0.4388 - val_accuracy: 0.8730
Epoch 107/200
266/266 [==============================] - 61s 228ms/step - loss: 0.2701 - accuracy: 0.9266 - val_loss: 0.4265 - val_accuracy: 0.8723
Epoch 108/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2537 - accuracy: 0.9313 - val_loss: 0.4319 - val_accuracy: 0.8664
Epoch 109/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2538 - accuracy: 0.9371 - val_loss: 0.4255 - val_accuracy: 0.8737
Epoch 110/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2447 - accuracy: 0.9335 - val_loss: 0.4306 - val_accuracy: 0.8664
Epoch 111/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2598 - accuracy: 0.9346 - val_loss: 0.4237 - val_accuracy: 0.8690
Epoch 112/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2435 - accuracy: 0.9365 - val_loss: 0.4264 - val_accuracy: 0.8677
Epoch 113/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2533 - accuracy: 0.9342 - val_loss: 0.4193 - val_accuracy: 0.8757
Epoch 114/200
266/266 [==============================] - 61s 231ms/step - loss: 0.2389 - accuracy: 0.9360 - val_loss: 0.4217 - val_accuracy: 0.8670
Epoch 115/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2404 - accuracy: 0.9386 - val_loss: 0.4202 - val_accuracy: 0.8743
Epoch 116/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2294 - accuracy: 0.9396 - val_loss: 0.4231 - val_accuracy: 0.8777
Epoch 117/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2399 - accuracy: 0.9371 - val_loss: 0.4241 - val_accuracy: 0.8737
Epoch 118/200
266/266 [==============================] - 61s 228ms/step - loss: 0.2318 - accuracy: 0.9376 - val_loss: 0.4217 - val_accuracy: 0.8783
Epoch 119/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2196 - accuracy: 0.9469 - val_loss: 0.4222 - val_accuracy: 0.8657
Epoch 120/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2250 - accuracy: 0.9420 - val_loss: 0.4217 - val_accuracy: 0.8757
Epoch 121/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2238 - accuracy: 0.9427 - val_loss: 0.4220 - val_accuracy: 0.8657
Epoch 122/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2265 - accuracy: 0.9428 - val_loss: 0.4176 - val_accuracy: 0.8703
Epoch 123/200
266/266 [==============================] - 61s 229ms/step - loss: 0.2080 - accuracy: 0.9505 - val_loss: 0.4102 - val_accuracy: 0.8743
Epoch 124/200
266/266 [==============================] - 61s 228ms/step - loss: 0.2247 - accuracy: 0.9410 - val_loss: 0.4186 - val_accuracy: 0.8717
Epoch 125/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2232 - accuracy: 0.9423 - val_loss: 0.4145 - val_accuracy: 0.8697
Epoch 126/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2149 - accuracy: 0.9431 - val_loss: 0.4165 - val_accuracy: 0.8723
Epoch 127/200
266/266 [==============================] - 61s 231ms/step - loss: 0.2141 - accuracy: 0.9449 - val_loss: 0.4112 - val_accuracy: 0.8737
Epoch 128/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1990 - accuracy: 0.9521 - val_loss: 0.4114 - val_accuracy: 0.8730
Epoch 129/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1982 - accuracy: 0.9516 - val_loss: 0.4092 - val_accuracy: 0.8750
Epoch 130/200
266/266 [==============================] - 61s 228ms/step - loss: 0.2090 - accuracy: 0.9454 - val_loss: 0.4208 - val_accuracy: 0.8657
Epoch 131/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1912 - accuracy: 0.9534 - val_loss: 0.4123 - val_accuracy: 0.8750
Epoch 132/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1982 - accuracy: 0.9473 - val_loss: 0.4187 - val_accuracy: 0.8710
Epoch 133/200
266/266 [==============================] - 62s 231ms/step - loss: 0.2109 - accuracy: 0.9448 - val_loss: 0.4105 - val_accuracy: 0.8684
Epoch 134/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1942 - accuracy: 0.9534 - val_loss: 0.4070 - val_accuracy: 0.8777
Epoch 135/200
266/266 [==============================] - 61s 230ms/step - loss: 0.2020 - accuracy: 0.9486 - val_loss: 0.4058 - val_accuracy: 0.8783
Epoch 136/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1977 - accuracy: 0.9504 - val_loss: 0.4048 - val_accuracy: 0.8737
Epoch 137/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1941 - accuracy: 0.9525 - val_loss: 0.4070 - val_accuracy: 0.8703
Epoch 138/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1914 - accuracy: 0.9540 - val_loss: 0.4087 - val_accuracy: 0.8743
Epoch 139/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1918 - accuracy: 0.9549 - val_loss: 0.4045 - val_accuracy: 0.8710
Epoch 140/200
266/266 [==============================] - 62s 231ms/step - loss: 0.1951 - accuracy: 0.9503 - val_loss: 0.4038 - val_accuracy: 0.8743
Epoch 141/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1838 - accuracy: 0.9561 - val_loss: 0.4035 - val_accuracy: 0.8723
Epoch 142/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1793 - accuracy: 0.9577 - val_loss: 0.4037 - val_accuracy: 0.8730
Epoch 143/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1807 - accuracy: 0.9550 - val_loss: 0.4141 - val_accuracy: 0.8690
Epoch 144/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1825 - accuracy: 0.9538 - val_loss: 0.4031 - val_accuracy: 0.8737
Epoch 145/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1839 - accuracy: 0.9574 - val_loss: 0.4036 - val_accuracy: 0.8743
Epoch 146/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1779 - accuracy: 0.9533 - val_loss: 0.4012 - val_accuracy: 0.8730
Epoch 147/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1712 - accuracy: 0.9575 - val_loss: 0.4017 - val_accuracy: 0.8717
Epoch 148/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1732 - accuracy: 0.9591 - val_loss: 0.4084 - val_accuracy: 0.8690
Epoch 149/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1761 - accuracy: 0.9591 - val_loss: 0.3975 - val_accuracy: 0.8743
Epoch 150/200
266/266 [==============================] - 60s 227ms/step - loss: 0.1775 - accuracy: 0.9550 - val_loss: 0.4029 - val_accuracy: 0.8730
Epoch 151/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1721 - accuracy: 0.9562 - val_loss: 0.4015 - val_accuracy: 0.8703
Epoch 152/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1661 - accuracy: 0.9599 - val_loss: 0.3966 - val_accuracy: 0.8797
Epoch 153/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1705 - accuracy: 0.9586 - val_loss: 0.3984 - val_accuracy: 0.8750
Epoch 154/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1646 - accuracy: 0.9626 - val_loss: 0.3982 - val_accuracy: 0.8730
Epoch 155/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1608 - accuracy: 0.9601 - val_loss: 0.3905 - val_accuracy: 0.8770
Epoch 156/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1645 - accuracy: 0.9618 - val_loss: 0.3899 - val_accuracy: 0.8797
Epoch 157/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1551 - accuracy: 0.9641 - val_loss: 0.3920 - val_accuracy: 0.8783
Epoch 158/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1629 - accuracy: 0.9622 - val_loss: 0.3922 - val_accuracy: 0.8790
Epoch 159/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1582 - accuracy: 0.9638 - val_loss: 0.3958 - val_accuracy: 0.8790
Epoch 160/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1422 - accuracy: 0.9670 - val_loss: 0.3950 - val_accuracy: 0.8737
Epoch 161/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1517 - accuracy: 0.9639 - val_loss: 0.3968 - val_accuracy: 0.8730
Epoch 162/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1432 - accuracy: 0.9693 - val_loss: 0.4046 - val_accuracy: 0.8763
Epoch 163/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1478 - accuracy: 0.9672 - val_loss: 0.3924 - val_accuracy: 0.8777
Epoch 164/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1501 - accuracy: 0.9639 - val_loss: 0.3912 - val_accuracy: 0.8790
Epoch 165/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1537 - accuracy: 0.9641 - val_loss: 0.3985 - val_accuracy: 0.8750
Epoch 166/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1527 - accuracy: 0.9661 - val_loss: 0.3965 - val_accuracy: 0.8717
Epoch 167/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1439 - accuracy: 0.9673 - val_loss: 0.3933 - val_accuracy: 0.8763
Epoch 168/200
266/266 [==============================] - 60s 227ms/step - loss: 0.1453 - accuracy: 0.9650 - val_loss: 0.3928 - val_accuracy: 0.8750
Epoch 169/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1495 - accuracy: 0.9663 - val_loss: 0.3893 - val_accuracy: 0.8737
Epoch 170/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1418 - accuracy: 0.9699 - val_loss: 0.3912 - val_accuracy: 0.8723
Epoch 171/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1431 - accuracy: 0.9676 - val_loss: 0.3921 - val_accuracy: 0.8757
Epoch 172/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1531 - accuracy: 0.9649 - val_loss: 0.3900 - val_accuracy: 0.8770
Epoch 173/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1482 - accuracy: 0.9658 - val_loss: 0.3919 - val_accuracy: 0.8750
Epoch 174/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1456 - accuracy: 0.9672 - val_loss: 0.3871 - val_accuracy: 0.8797
Epoch 175/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1439 - accuracy: 0.9672 - val_loss: 0.3885 - val_accuracy: 0.8777
Epoch 176/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1457 - accuracy: 0.9670 - val_loss: 0.3877 - val_accuracy: 0.8783
Epoch 177/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1359 - accuracy: 0.9692 - val_loss: 0.3942 - val_accuracy: 0.8723
Epoch 178/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1273 - accuracy: 0.9721 - val_loss: 0.3857 - val_accuracy: 0.8770
Epoch 179/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1253 - accuracy: 0.9737 - val_loss: 0.3869 - val_accuracy: 0.8830
Epoch 180/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1374 - accuracy: 0.9672 - val_loss: 0.3910 - val_accuracy: 0.8790
Epoch 181/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1357 - accuracy: 0.9687 - val_loss: 0.3896 - val_accuracy: 0.8803
Epoch 182/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1351 - accuracy: 0.9700 - val_loss: 0.3885 - val_accuracy: 0.8777
Epoch 183/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1282 - accuracy: 0.9696 - val_loss: 0.3893 - val_accuracy: 0.8763
Epoch 184/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1243 - accuracy: 0.9735 - val_loss: 0.3908 - val_accuracy: 0.8777
Epoch 185/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1324 - accuracy: 0.9705 - val_loss: 0.3920 - val_accuracy: 0.8763
Epoch 186/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1266 - accuracy: 0.9728 - val_loss: 0.3847 - val_accuracy: 0.8797
Epoch 187/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1209 - accuracy: 0.9718 - val_loss: 0.3920 - val_accuracy: 0.8743
Epoch 188/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1210 - accuracy: 0.9766 - val_loss: 0.3828 - val_accuracy: 0.8810
Epoch 189/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1187 - accuracy: 0.9752 - val_loss: 0.3892 - val_accuracy: 0.8783
Epoch 190/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1273 - accuracy: 0.9721 - val_loss: 0.3870 - val_accuracy: 0.8757
Epoch 191/200
266/266 [==============================] - 61s 228ms/step - loss: 0.1205 - accuracy: 0.9758 - val_loss: 0.3890 - val_accuracy: 0.8777
Epoch 192/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1152 - accuracy: 0.9730 - val_loss: 0.3848 - val_accuracy: 0.8763
Epoch 193/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1204 - accuracy: 0.9742 - val_loss: 0.3837 - val_accuracy: 0.8836
Epoch 194/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1193 - accuracy: 0.9703 - val_loss: 0.3846 - val_accuracy: 0.8763
Epoch 195/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1112 - accuracy: 0.9785 - val_loss: 0.3839 - val_accuracy: 0.8836
Epoch 196/200
266/266 [==============================] - 61s 231ms/step - loss: 0.1093 - accuracy: 0.9781 - val_loss: 0.3840 - val_accuracy: 0.8797
Epoch 197/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1145 - accuracy: 0.9776 - val_loss: 0.3840 - val_accuracy: 0.8797
Epoch 198/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1153 - accuracy: 0.9768 - val_loss: 0.3910 - val_accuracy: 0.8743
Epoch 199/200
266/266 [==============================] - 61s 229ms/step - loss: 0.1102 - accuracy: 0.9762 - val_loss: 0.3878 - val_accuracy: 0.8770
Epoch 200/200
266/266 [==============================] - 61s 230ms/step - loss: 0.1161 - accuracy: 0.9753 - val_loss: 0.3816 - val_accuracy: 0.8797
In [ ]:
_, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
accuracies_opt_SGD["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.386
Accuracy: 87.946%
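The `model_report` helper used above is defined earlier in the notebook; its printed block (loss to three decimals, accuracy as a percentage) can be mirrored by a small formatting function. This is an illustrative sketch of the output format only, not the notebook's actual helper, which also evaluates the model and plots its history.

```python
# Hypothetical sketch: format evaluation metrics in the style printed
# by the notebook's model_report helper. The real helper presumably
# calls model.evaluate(test_ds) first to obtain these two numbers.
def report_metrics(loss, accuracy):
    lines = [
        "Test set evaluation metrics",
        "---------------------------",
        f"Loss:     {loss:.3f}",
        f"Accuracy: {accuracy * 100:.3f}%",
    ]
    return "\n".join(lines)
```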
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True, optimizer = tf.optimizers.SGD)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_3 (Dropout)          (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_3 ( (None, 1024)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 18s 35ms/step - loss: 4.7415 - accuracy: 0.0455 - val_loss: 3.2504 - val_accuracy: 0.0605
Epoch 2/200
266/266 [==============================] - 8s 29ms/step - loss: 4.2008 - accuracy: 0.0723 - val_loss: 3.2000 - val_accuracy: 0.1170
Epoch 3/200
266/266 [==============================] - 8s 29ms/step - loss: 3.8160 - accuracy: 0.1035 - val_loss: 2.9576 - val_accuracy: 0.1789
Epoch 4/200
266/266 [==============================] - 8s 29ms/step - loss: 3.4101 - accuracy: 0.1408 - val_loss: 2.7331 - val_accuracy: 0.2274
Epoch 5/200
266/266 [==============================] - 8s 29ms/step - loss: 3.1596 - accuracy: 0.1808 - val_loss: 2.4787 - val_accuracy: 0.2786
Epoch 6/200
266/266 [==============================] - 8s 29ms/step - loss: 2.9283 - accuracy: 0.2128 - val_loss: 2.3272 - val_accuracy: 0.3271
Epoch 7/200
266/266 [==============================] - 8s 29ms/step - loss: 2.8170 - accuracy: 0.2475 - val_loss: 2.2203 - val_accuracy: 0.3637
Epoch 8/200
266/266 [==============================] - 8s 29ms/step - loss: 2.6658 - accuracy: 0.2622 - val_loss: 2.1395 - val_accuracy: 0.3890
Epoch 9/200
266/266 [==============================] - 8s 29ms/step - loss: 2.5281 - accuracy: 0.2826 - val_loss: 2.0504 - val_accuracy: 0.4202
Epoch 10/200
266/266 [==============================] - 8s 29ms/step - loss: 2.4097 - accuracy: 0.3073 - val_loss: 1.9782 - val_accuracy: 0.4322
Epoch 11/200
266/266 [==============================] - 8s 29ms/step - loss: 2.3416 - accuracy: 0.3179 - val_loss: 1.9062 - val_accuracy: 0.4648
Epoch 12/200
266/266 [==============================] - 8s 29ms/step - loss: 2.2730 - accuracy: 0.3443 - val_loss: 1.8502 - val_accuracy: 0.4827
Epoch 13/200
266/266 [==============================] - 8s 29ms/step - loss: 2.1823 - accuracy: 0.3649 - val_loss: 1.7944 - val_accuracy: 0.4894
Epoch 14/200
266/266 [==============================] - 8s 29ms/step - loss: 2.1027 - accuracy: 0.3786 - val_loss: 1.7509 - val_accuracy: 0.5093
Epoch 15/200
266/266 [==============================] - 8s 29ms/step - loss: 2.0879 - accuracy: 0.3875 - val_loss: 1.7025 - val_accuracy: 0.5113
Epoch 16/200
266/266 [==============================] - 8s 29ms/step - loss: 1.9917 - accuracy: 0.4109 - val_loss: 1.6619 - val_accuracy: 0.5199
Epoch 17/200
266/266 [==============================] - 8s 29ms/step - loss: 1.9510 - accuracy: 0.4176 - val_loss: 1.6256 - val_accuracy: 0.5279
Epoch 18/200
266/266 [==============================] - 8s 29ms/step - loss: 1.8717 - accuracy: 0.4466 - val_loss: 1.5873 - val_accuracy: 0.5406
Epoch 19/200
266/266 [==============================] - 8s 29ms/step - loss: 1.8403 - accuracy: 0.4601 - val_loss: 1.5566 - val_accuracy: 0.5512
Epoch 20/200
266/266 [==============================] - 8s 29ms/step - loss: 1.8325 - accuracy: 0.4644 - val_loss: 1.5221 - val_accuracy: 0.5598
Epoch 21/200
266/266 [==============================] - 8s 29ms/step - loss: 1.7381 - accuracy: 0.4934 - val_loss: 1.4937 - val_accuracy: 0.5592
Epoch 22/200
266/266 [==============================] - 8s 29ms/step - loss: 1.7299 - accuracy: 0.4898 - val_loss: 1.4704 - val_accuracy: 0.5711
Epoch 23/200
266/266 [==============================] - 8s 29ms/step - loss: 1.6574 - accuracy: 0.5121 - val_loss: 1.4459 - val_accuracy: 0.5785
Epoch 24/200
266/266 [==============================] - 8s 29ms/step - loss: 1.6585 - accuracy: 0.5112 - val_loss: 1.4254 - val_accuracy: 0.5851
Epoch 25/200
266/266 [==============================] - 8s 29ms/step - loss: 1.6068 - accuracy: 0.5158 - val_loss: 1.4112 - val_accuracy: 0.5871
Epoch 26/200
266/266 [==============================] - 8s 29ms/step - loss: 1.5918 - accuracy: 0.5245 - val_loss: 1.3868 - val_accuracy: 0.5924
Epoch 27/200
266/266 [==============================] - 8s 29ms/step - loss: 1.5901 - accuracy: 0.5275 - val_loss: 1.3688 - val_accuracy: 0.6011
Epoch 28/200
266/266 [==============================] - 8s 29ms/step - loss: 1.5044 - accuracy: 0.5496 - val_loss: 1.3450 - val_accuracy: 0.6051
Epoch 29/200
266/266 [==============================] - 8s 29ms/step - loss: 1.4720 - accuracy: 0.5577 - val_loss: 1.3304 - val_accuracy: 0.6137
Epoch 30/200
266/266 [==============================] - 8s 30ms/step - loss: 1.4594 - accuracy: 0.5628 - val_loss: 1.3028 - val_accuracy: 0.6184
Epoch 31/200
266/266 [==============================] - 8s 29ms/step - loss: 1.4476 - accuracy: 0.5687 - val_loss: 1.2919 - val_accuracy: 0.6210
Epoch 32/200
266/266 [==============================] - 8s 29ms/step - loss: 1.4134 - accuracy: 0.5658 - val_loss: 1.2755 - val_accuracy: 0.6343
Epoch 33/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3972 - accuracy: 0.5789 - val_loss: 1.2609 - val_accuracy: 0.6316
Epoch 34/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3830 - accuracy: 0.5821 - val_loss: 1.2525 - val_accuracy: 0.6370
Epoch 35/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3766 - accuracy: 0.5836 - val_loss: 1.2443 - val_accuracy: 0.6416
Epoch 36/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3348 - accuracy: 0.5930 - val_loss: 1.2256 - val_accuracy: 0.6396
Epoch 37/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3490 - accuracy: 0.5980 - val_loss: 1.2101 - val_accuracy: 0.6496
Epoch 38/200
266/266 [==============================] - 8s 29ms/step - loss: 1.3084 - accuracy: 0.6113 - val_loss: 1.2002 - val_accuracy: 0.6543
Epoch 39/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2951 - accuracy: 0.6180 - val_loss: 1.1942 - val_accuracy: 0.6616
Epoch 40/200
266/266 [==============================] - 8s 30ms/step - loss: 1.2666 - accuracy: 0.6194 - val_loss: 1.1846 - val_accuracy: 0.6562
Epoch 41/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2739 - accuracy: 0.6175 - val_loss: 1.1779 - val_accuracy: 0.6622
Epoch 42/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2559 - accuracy: 0.6181 - val_loss: 1.1642 - val_accuracy: 0.6636
Epoch 43/200
266/266 [==============================] - 8s 29ms/step - loss: 1.2034 - accuracy: 0.6337 - val_loss: 1.1541 - val_accuracy: 0.6649
Epoch 44/200
266/266 [==============================] - 8s 30ms/step - loss: 1.2262 - accuracy: 0.6225 - val_loss: 1.1469 - val_accuracy: 0.6715
Epoch 45/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1859 - accuracy: 0.6397 - val_loss: 1.1371 - val_accuracy: 0.6709
Epoch 46/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1867 - accuracy: 0.6390 - val_loss: 1.1272 - val_accuracy: 0.6769
Epoch 47/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1301 - accuracy: 0.6566 - val_loss: 1.1181 - val_accuracy: 0.6775
Epoch 48/200
266/266 [==============================] - 8s 30ms/step - loss: 1.1207 - accuracy: 0.6596 - val_loss: 1.1205 - val_accuracy: 0.6782
Epoch 49/200
266/266 [==============================] - 8s 29ms/step - loss: 1.1385 - accuracy: 0.6526 - val_loss: 1.1075 - val_accuracy: 0.6789
Epoch 50/200
266/266 [==============================] - 8s 30ms/step - loss: 1.1203 - accuracy: 0.6654 - val_loss: 1.0873 - val_accuracy: 0.6822
Epoch 51/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0919 - accuracy: 0.6664 - val_loss: 1.0947 - val_accuracy: 0.6802
Epoch 52/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0800 - accuracy: 0.6643 - val_loss: 1.0877 - val_accuracy: 0.6848
Epoch 53/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0787 - accuracy: 0.6701 - val_loss: 1.0764 - val_accuracy: 0.6875
Epoch 54/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0508 - accuracy: 0.6835 - val_loss: 1.0761 - val_accuracy: 0.6882
Epoch 55/200
266/266 [==============================] - 8s 30ms/step - loss: 1.0623 - accuracy: 0.6766 - val_loss: 1.0736 - val_accuracy: 0.6862
Epoch 56/200
266/266 [==============================] - 8s 30ms/step - loss: 1.0367 - accuracy: 0.6887 - val_loss: 1.0635 - val_accuracy: 0.6981
Epoch 57/200
266/266 [==============================] - 8s 30ms/step - loss: 1.0374 - accuracy: 0.6774 - val_loss: 1.0625 - val_accuracy: 0.6908
Epoch 58/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0172 - accuracy: 0.6949 - val_loss: 1.0522 - val_accuracy: 0.6941
Epoch 59/200
266/266 [==============================] - 8s 29ms/step - loss: 1.0208 - accuracy: 0.6878 - val_loss: 1.0483 - val_accuracy: 0.6948
Epoch 60/200
266/266 [==============================] - 8s 30ms/step - loss: 0.9965 - accuracy: 0.6944 - val_loss: 1.0468 - val_accuracy: 0.6955
Epoch 61/200
266/266 [==============================] - 8s 30ms/step - loss: 0.9362 - accuracy: 0.7055 - val_loss: 1.0421 - val_accuracy: 0.6968
Epoch 62/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9570 - accuracy: 0.7047 - val_loss: 1.0405 - val_accuracy: 0.7028
Epoch 63/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9499 - accuracy: 0.6972 - val_loss: 1.0341 - val_accuracy: 0.6988
Epoch 64/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9366 - accuracy: 0.7150 - val_loss: 1.0246 - val_accuracy: 0.7028
Epoch 65/200
266/266 [==============================] - 8s 30ms/step - loss: 0.9684 - accuracy: 0.6995 - val_loss: 1.0206 - val_accuracy: 0.7035
Epoch 66/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9137 - accuracy: 0.7183 - val_loss: 1.0213 - val_accuracy: 0.7055
Epoch 67/200
266/266 [==============================] - 8s 29ms/step - loss: 0.9055 - accuracy: 0.7199 - val_loss: 1.0144 - val_accuracy: 0.7081
Epoch 68/200
266/266 [==============================] - 8s 30ms/step - loss: 0.9117 - accuracy: 0.7177 - val_loss: 1.0127 - val_accuracy: 0.7041
Epoch 69/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8906 - accuracy: 0.7231 - val_loss: 1.0020 - val_accuracy: 0.7088
Epoch 70/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8611 - accuracy: 0.7316 - val_loss: 1.0053 - val_accuracy: 0.7068
Epoch 71/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8978 - accuracy: 0.7213 - val_loss: 1.0062 - val_accuracy: 0.7108
Epoch 72/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8772 - accuracy: 0.7321 - val_loss: 1.0048 - val_accuracy: 0.7088
Epoch 73/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8590 - accuracy: 0.7293 - val_loss: 1.0008 - val_accuracy: 0.7055
Epoch 74/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8588 - accuracy: 0.7382 - val_loss: 0.9986 - val_accuracy: 0.7128
Epoch 75/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8687 - accuracy: 0.7310 - val_loss: 0.9855 - val_accuracy: 0.7141
Epoch 76/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8302 - accuracy: 0.7509 - val_loss: 0.9861 - val_accuracy: 0.7148
Epoch 77/200
266/266 [==============================] - 8s 29ms/step - loss: 0.8340 - accuracy: 0.7442 - val_loss: 0.9825 - val_accuracy: 0.7121
Epoch 78/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8217 - accuracy: 0.7467 - val_loss: 0.9811 - val_accuracy: 0.7161
Epoch 79/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8009 - accuracy: 0.7463 - val_loss: 0.9777 - val_accuracy: 0.7154
Epoch 80/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7969 - accuracy: 0.7401 - val_loss: 0.9723 - val_accuracy: 0.7174
Epoch 81/200
266/266 [==============================] - 8s 30ms/step - loss: 0.8103 - accuracy: 0.7528 - val_loss: 0.9744 - val_accuracy: 0.7201
Epoch 82/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7693 - accuracy: 0.7528 - val_loss: 0.9688 - val_accuracy: 0.7188
Epoch 83/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7879 - accuracy: 0.7553 - val_loss: 0.9677 - val_accuracy: 0.7194
Epoch 84/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7887 - accuracy: 0.7538 - val_loss: 0.9596 - val_accuracy: 0.7267
Epoch 85/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7279 - accuracy: 0.7664 - val_loss: 0.9601 - val_accuracy: 0.7234
Epoch 86/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7789 - accuracy: 0.7597 - val_loss: 0.9659 - val_accuracy: 0.7207
Epoch 87/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7552 - accuracy: 0.7639 - val_loss: 0.9657 - val_accuracy: 0.7181
Epoch 88/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7280 - accuracy: 0.7689 - val_loss: 0.9573 - val_accuracy: 0.7181
Epoch 89/200
266/266 [==============================] - 8s 29ms/step - loss: 0.7398 - accuracy: 0.7613 - val_loss: 0.9557 - val_accuracy: 0.7227
Epoch 90/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7361 - accuracy: 0.7682 - val_loss: 0.9561 - val_accuracy: 0.7168
Epoch 91/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7317 - accuracy: 0.7619 - val_loss: 0.9575 - val_accuracy: 0.7214
Epoch 92/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6985 - accuracy: 0.7803 - val_loss: 0.9627 - val_accuracy: 0.7227
Epoch 93/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7005 - accuracy: 0.7804 - val_loss: 0.9583 - val_accuracy: 0.7221
Epoch 94/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6878 - accuracy: 0.7843 - val_loss: 0.9520 - val_accuracy: 0.7234
Epoch 95/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6914 - accuracy: 0.7839 - val_loss: 0.9546 - val_accuracy: 0.7261
Epoch 96/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6609 - accuracy: 0.7953 - val_loss: 0.9434 - val_accuracy: 0.7261
Epoch 97/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6961 - accuracy: 0.7838 - val_loss: 0.9426 - val_accuracy: 0.7261
Epoch 98/200
266/266 [==============================] - 8s 30ms/step - loss: 0.7062 - accuracy: 0.7705 - val_loss: 0.9422 - val_accuracy: 0.7261
Epoch 99/200
266/266 [==============================] - 8s 29ms/step - loss: 0.6776 - accuracy: 0.7840 - val_loss: 0.9433 - val_accuracy: 0.7347
Epoch 100/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6332 - accuracy: 0.8034 - val_loss: 0.9347 - val_accuracy: 0.7327
Epoch 101/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6379 - accuracy: 0.7988 - val_loss: 0.9326 - val_accuracy: 0.7314
Epoch 102/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6501 - accuracy: 0.8002 - val_loss: 0.9294 - val_accuracy: 0.7327
Epoch 103/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6336 - accuracy: 0.7979 - val_loss: 0.9362 - val_accuracy: 0.7261
Epoch 104/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6409 - accuracy: 0.7977 - val_loss: 0.9295 - val_accuracy: 0.7294
Epoch 105/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6301 - accuracy: 0.8015 - val_loss: 0.9364 - val_accuracy: 0.7307
Epoch 106/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6284 - accuracy: 0.8082 - val_loss: 0.9305 - val_accuracy: 0.7340
Epoch 107/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5987 - accuracy: 0.8083 - val_loss: 0.9251 - val_accuracy: 0.7314
Epoch 108/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6274 - accuracy: 0.8093 - val_loss: 0.9285 - val_accuracy: 0.7281
Epoch 109/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5909 - accuracy: 0.8099 - val_loss: 0.9268 - val_accuracy: 0.7307
Epoch 110/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6161 - accuracy: 0.8149 - val_loss: 0.9242 - val_accuracy: 0.7334
Epoch 111/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5886 - accuracy: 0.8190 - val_loss: 0.9213 - val_accuracy: 0.7327
Epoch 112/200
266/266 [==============================] - 8s 30ms/step - loss: 0.6225 - accuracy: 0.8015 - val_loss: 0.9297 - val_accuracy: 0.7307
Epoch 113/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5839 - accuracy: 0.8199 - val_loss: 0.9230 - val_accuracy: 0.7367
Epoch 114/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5584 - accuracy: 0.8257 - val_loss: 0.9224 - val_accuracy: 0.7301
Epoch 115/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5618 - accuracy: 0.8199 - val_loss: 0.9244 - val_accuracy: 0.7387
Epoch 116/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5649 - accuracy: 0.8277 - val_loss: 0.9231 - val_accuracy: 0.7294
Epoch 117/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5551 - accuracy: 0.8262 - val_loss: 0.9249 - val_accuracy: 0.7354
Epoch 118/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5487 - accuracy: 0.8255 - val_loss: 0.9280 - val_accuracy: 0.7334
Epoch 119/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5447 - accuracy: 0.8298 - val_loss: 0.9229 - val_accuracy: 0.7320
Epoch 120/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5784 - accuracy: 0.8203 - val_loss: 0.9239 - val_accuracy: 0.7347
Epoch 121/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5655 - accuracy: 0.8296 - val_loss: 0.9222 - val_accuracy: 0.7294
Epoch 122/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5486 - accuracy: 0.8274 - val_loss: 0.9223 - val_accuracy: 0.7267
Epoch 123/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5427 - accuracy: 0.8296 - val_loss: 0.9218 - val_accuracy: 0.7320
Epoch 124/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5232 - accuracy: 0.8341 - val_loss: 0.9137 - val_accuracy: 0.7354
Epoch 125/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5241 - accuracy: 0.8333 - val_loss: 0.9134 - val_accuracy: 0.7334
Epoch 126/200
266/266 [==============================] - 8s 29ms/step - loss: 0.4944 - accuracy: 0.8388 - val_loss: 0.9185 - val_accuracy: 0.7360
Epoch 127/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5109 - accuracy: 0.8348 - val_loss: 0.9189 - val_accuracy: 0.7347
Epoch 128/200
266/266 [==============================] - 8s 29ms/step - loss: 0.5075 - accuracy: 0.8429 - val_loss: 0.9179 - val_accuracy: 0.7327
Epoch 129/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5092 - accuracy: 0.8392 - val_loss: 0.9216 - val_accuracy: 0.7374
Epoch 130/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5135 - accuracy: 0.8362 - val_loss: 0.9256 - val_accuracy: 0.7374
Epoch 131/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4872 - accuracy: 0.8498 - val_loss: 0.9246 - val_accuracy: 0.7360
Epoch 132/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4813 - accuracy: 0.8496 - val_loss: 0.9151 - val_accuracy: 0.7380
Epoch 133/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4699 - accuracy: 0.8522 - val_loss: 0.9194 - val_accuracy: 0.7367
Epoch 134/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4792 - accuracy: 0.8508 - val_loss: 0.9137 - val_accuracy: 0.7434
Epoch 135/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4636 - accuracy: 0.8529 - val_loss: 0.9185 - val_accuracy: 0.7427
Epoch 136/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4719 - accuracy: 0.8523 - val_loss: 0.9128 - val_accuracy: 0.7453
Epoch 137/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4530 - accuracy: 0.8642 - val_loss: 0.9103 - val_accuracy: 0.7473
Epoch 138/200
266/266 [==============================] - 8s 30ms/step - loss: 0.5015 - accuracy: 0.8420 - val_loss: 0.9097 - val_accuracy: 0.7400
Epoch 139/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4942 - accuracy: 0.8489 - val_loss: 0.9166 - val_accuracy: 0.7453
Epoch 140/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4615 - accuracy: 0.8541 - val_loss: 0.9109 - val_accuracy: 0.7460
Epoch 141/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4534 - accuracy: 0.8554 - val_loss: 0.9153 - val_accuracy: 0.7434
Epoch 142/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4511 - accuracy: 0.8534 - val_loss: 0.9134 - val_accuracy: 0.7374
Epoch 143/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4581 - accuracy: 0.8502 - val_loss: 0.9141 - val_accuracy: 0.7460
Epoch 144/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4433 - accuracy: 0.8541 - val_loss: 0.9120 - val_accuracy: 0.7387
Epoch 145/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4519 - accuracy: 0.8620 - val_loss: 0.9116 - val_accuracy: 0.7387
Epoch 146/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4462 - accuracy: 0.8608 - val_loss: 0.9050 - val_accuracy: 0.7427
Epoch 147/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4382 - accuracy: 0.8601 - val_loss: 0.9151 - val_accuracy: 0.7467
Epoch 148/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4403 - accuracy: 0.8594 - val_loss: 0.9149 - val_accuracy: 0.7453
Epoch 149/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4204 - accuracy: 0.8668 - val_loss: 0.9100 - val_accuracy: 0.7434
Epoch 150/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4080 - accuracy: 0.8723 - val_loss: 0.9157 - val_accuracy: 0.7447
Epoch 151/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4003 - accuracy: 0.8756 - val_loss: 0.9101 - val_accuracy: 0.7473
Epoch 152/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4078 - accuracy: 0.8694 - val_loss: 0.9113 - val_accuracy: 0.7447
Epoch 153/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4170 - accuracy: 0.8692 - val_loss: 0.9089 - val_accuracy: 0.7427
Epoch 154/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3977 - accuracy: 0.8734 - val_loss: 0.9022 - val_accuracy: 0.7427
Epoch 155/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4159 - accuracy: 0.8733 - val_loss: 0.9090 - val_accuracy: 0.7460
Epoch 156/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3919 - accuracy: 0.8766 - val_loss: 0.9149 - val_accuracy: 0.7473
Epoch 157/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3655 - accuracy: 0.8825 - val_loss: 0.9113 - val_accuracy: 0.7487
Epoch 158/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3896 - accuracy: 0.8801 - val_loss: 0.9204 - val_accuracy: 0.7480
Epoch 159/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3893 - accuracy: 0.8768 - val_loss: 0.9128 - val_accuracy: 0.7493
Epoch 160/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3745 - accuracy: 0.8825 - val_loss: 0.9109 - val_accuracy: 0.7467
Epoch 161/200
266/266 [==============================] - 8s 30ms/step - loss: 0.4195 - accuracy: 0.8690 - val_loss: 0.9153 - val_accuracy: 0.7380
Epoch 162/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3818 - accuracy: 0.8797 - val_loss: 0.9231 - val_accuracy: 0.7453
Epoch 163/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3886 - accuracy: 0.8794 - val_loss: 0.9133 - val_accuracy: 0.7367
Epoch 164/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3680 - accuracy: 0.8839 - val_loss: 0.9171 - val_accuracy: 0.7453
Epoch 165/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3616 - accuracy: 0.8879 - val_loss: 0.9157 - val_accuracy: 0.7467
Epoch 166/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3803 - accuracy: 0.8783 - val_loss: 0.9178 - val_accuracy: 0.7440
Epoch 167/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3648 - accuracy: 0.8871 - val_loss: 0.9232 - val_accuracy: 0.7473
Epoch 168/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3564 - accuracy: 0.8881 - val_loss: 0.9119 - val_accuracy: 0.7427
Epoch 169/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3498 - accuracy: 0.8928 - val_loss: 0.9170 - val_accuracy: 0.7473
Epoch 170/200
266/266 [==============================] - 8s 31ms/step - loss: 0.3563 - accuracy: 0.8880 - val_loss: 0.9175 - val_accuracy: 0.7394
Epoch 171/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3584 - accuracy: 0.8814 - val_loss: 0.9149 - val_accuracy: 0.7513
Epoch 172/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3499 - accuracy: 0.8867 - val_loss: 0.9129 - val_accuracy: 0.7453
Epoch 173/200
266/266 [==============================] - 8s 31ms/step - loss: 0.3328 - accuracy: 0.8958 - val_loss: 0.9138 - val_accuracy: 0.7427
Epoch 174/200
266/266 [==============================] - 8s 30ms/step - loss: 0.3365 - accuracy: 0.8946 - val_loss: 0.9241 - val_accuracy: 0.7487
In [ ]:
_, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
accuracies_opt_SGD["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.906
Accuracy: 74.554%
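Note that the DenseNet run above stopped at epoch 174 of 200, which is consistent with the `callback` passed to `train_model` being an early-stopping criterion on the validation loss. As an assumption-labelled illustration (the notebook itself uses a Keras callback object defined earlier), the underlying logic can be sketched in plain Python: stop once `val_loss` has failed to improve for `patience` consecutive epochs.

```python
# Illustrative sketch of early stopping (assumption: the notebook's
# `callback` implements something like this via tf.keras.callbacks).
# Returns the 1-based epoch at which training would stop, or None if
# the full schedule runs without triggering.
def early_stop_epoch(val_losses, patience=20):
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss   # new best validation loss
            wait = 0      # reset the patience counter
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None
```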

RMSprop

"From scratch" networks

In [ ]:
accuracies_opt_RMSprop = {}
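The models are now recompiled with `tf.optimizers.RMSprop` instead of SGD. For reference, the per-parameter update rule that RMSprop applies can be sketched as follows; this is an illustration of the algorithm with the common default hyperparameter names (`lr`, `rho`, `eps`), not the notebook's code, which simply passes the optimizer class to the model factory.

```python
# Sketch of one RMSprop update step for a single scalar parameter.
# `cache` is the running average of squared gradients; dividing the
# gradient by its root adapts the effective learning rate per weight.
def rmsprop_step(w, grad, cache, lr=0.001, rho=0.9, eps=1e-7):
    cache = rho * cache + (1 - rho) * grad ** 2
    w = w - lr * grad / (cache ** 0.5 + eps)
    return w, cache
```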
Simple CNN
In [ ]:
SIMPLE_MODEL_OPTIMIZED = init_simple_model_optimized(summary = True, optimizer = tf.optimizers.RMSprop)
SIMPLE_MODEL_OPTIMIZED_history = train_model(SIMPLE_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization (BatchNo (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu (ReLU)                 (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 32)        0         
_________________________________________________________________
dropout (Dropout)            (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_1 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_1 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          36928     
_________________________________________________________________
batch_normalization_2 (Batch (None, 4, 4, 64)          256       
_________________________________________________________________
re_lu_2 (ReLU)               (None, 4, 4, 64)          0         
_________________________________________________________________
flatten (Flatten)            (None, 1024)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 64)                65600     
_________________________________________________________________
dense_1 (Dense)              (None, 20)                1300      
=================================================================
Total params: 123,860
Trainable params: 123,540
Non-trainable params: 320
_________________________________________________________________
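The Param # column of the summary above can be verified by hand. A quick check in plain Python, assuming 3×3 kernels (inferred from the 32→30 and 15→13 "valid"-convolution output shapes, not stated in the summary itself):

```python
def conv_params(c_in, c_out, k=3):
    return k * k * c_in * c_out + c_out          # kernel weights + biases

def bn_params(c):
    return 4 * c                                 # gamma, beta, moving mean/variance

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

total = (conv_params(3, 32) + bn_params(32)      # 896 + 128
         + conv_params(32, 64) + bn_params(64)   # 18496 + 256
         + conv_params(64, 64) + bn_params(64)   # 36928 + 256
         + dense_params(1024, 64)                # 65600
         + dense_params(64, 20))                 # 1300

# Only the BatchNorm moving statistics (half of each layer's 4*c) are non-trainable.
non_trainable = (bn_params(32) + bn_params(64) + bn_params(64)) // 2
```

Both values agree with the summary: 123,860 total and 320 non-trainable parameters.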
Epoch 1/200
266/266 [==============================] - 9s 7ms/step - loss: 4.2633 - accuracy: 0.0620 - val_loss: 4.1571 - val_accuracy: 0.0898
Epoch 2/200
266/266 [==============================] - 2s 6ms/step - loss: 3.8029 - accuracy: 0.1761 - val_loss: 3.7418 - val_accuracy: 0.1875
Epoch 3/200
266/266 [==============================] - 1s 5ms/step - loss: 3.5476 - accuracy: 0.2293 - val_loss: 3.4099 - val_accuracy: 0.2520
Epoch 4/200
266/266 [==============================] - 1s 5ms/step - loss: 3.3748 - accuracy: 0.2660 - val_loss: 3.1844 - val_accuracy: 0.3025
Epoch 5/200
266/266 [==============================] - 1s 5ms/step - loss: 3.1896 - accuracy: 0.2988 - val_loss: 3.0279 - val_accuracy: 0.3305
Epoch 6/200
266/266 [==============================] - 1s 6ms/step - loss: 3.0345 - accuracy: 0.3382 - val_loss: 2.8616 - val_accuracy: 0.3684
Epoch 7/200
266/266 [==============================] - 1s 5ms/step - loss: 2.9136 - accuracy: 0.3597 - val_loss: 2.7998 - val_accuracy: 0.3803
Epoch 8/200
266/266 [==============================] - 1s 5ms/step - loss: 2.7932 - accuracy: 0.3885 - val_loss: 2.6588 - val_accuracy: 0.4149
Epoch 9/200
266/266 [==============================] - 1s 5ms/step - loss: 2.6684 - accuracy: 0.4082 - val_loss: 2.5391 - val_accuracy: 0.4362
Epoch 10/200
266/266 [==============================] - 1s 5ms/step - loss: 2.5965 - accuracy: 0.4245 - val_loss: 2.5579 - val_accuracy: 0.4255
Epoch 11/200
266/266 [==============================] - 1s 5ms/step - loss: 2.5066 - accuracy: 0.4401 - val_loss: 2.4786 - val_accuracy: 0.4395
Epoch 12/200
266/266 [==============================] - 1s 5ms/step - loss: 2.4419 - accuracy: 0.4466 - val_loss: 2.4126 - val_accuracy: 0.4661
Epoch 13/200
266/266 [==============================] - 1s 5ms/step - loss: 2.3344 - accuracy: 0.4698 - val_loss: 2.3197 - val_accuracy: 0.4721
Epoch 14/200
266/266 [==============================] - 2s 6ms/step - loss: 2.2919 - accuracy: 0.4696 - val_loss: 2.4086 - val_accuracy: 0.4568
Epoch 15/200
266/266 [==============================] - 1s 5ms/step - loss: 2.1857 - accuracy: 0.4990 - val_loss: 2.4111 - val_accuracy: 0.4422
Epoch 16/200
266/266 [==============================] - 1s 5ms/step - loss: 2.1608 - accuracy: 0.5040 - val_loss: 2.4066 - val_accuracy: 0.4475
Epoch 17/200
266/266 [==============================] - 1s 5ms/step - loss: 2.1175 - accuracy: 0.5131 - val_loss: 2.0808 - val_accuracy: 0.5253
Epoch 18/200
266/266 [==============================] - 1s 5ms/step - loss: 2.0772 - accuracy: 0.5091 - val_loss: 2.2735 - val_accuracy: 0.4774
Epoch 19/200
266/266 [==============================] - 2s 6ms/step - loss: 2.0044 - accuracy: 0.5344 - val_loss: 2.0152 - val_accuracy: 0.5293
Epoch 20/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9969 - accuracy: 0.5301 - val_loss: 2.2163 - val_accuracy: 0.4860
Epoch 21/200
266/266 [==============================] - 1s 5ms/step - loss: 1.9102 - accuracy: 0.5481 - val_loss: 1.9764 - val_accuracy: 0.5372
Epoch 22/200
266/266 [==============================] - 1s 5ms/step - loss: 1.8999 - accuracy: 0.5521 - val_loss: 2.0552 - val_accuracy: 0.5193
Epoch 23/200
266/266 [==============================] - 1s 5ms/step - loss: 1.8392 - accuracy: 0.5658 - val_loss: 2.1058 - val_accuracy: 0.5053
Epoch 24/200
266/266 [==============================] - 1s 5ms/step - loss: 1.8016 - accuracy: 0.5678 - val_loss: 2.0383 - val_accuracy: 0.5199
Epoch 25/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7625 - accuracy: 0.5810 - val_loss: 1.9713 - val_accuracy: 0.5239
Epoch 26/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7317 - accuracy: 0.5770 - val_loss: 1.8440 - val_accuracy: 0.5638
Epoch 27/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7153 - accuracy: 0.5872 - val_loss: 1.9434 - val_accuracy: 0.5326
Epoch 28/200
266/266 [==============================] - 1s 5ms/step - loss: 1.6715 - accuracy: 0.5910 - val_loss: 1.9007 - val_accuracy: 0.5346
Epoch 29/200
266/266 [==============================] - 1s 5ms/step - loss: 1.6254 - accuracy: 0.6101 - val_loss: 1.8077 - val_accuracy: 0.5625
Epoch 30/200
266/266 [==============================] - 1s 5ms/step - loss: 1.5977 - accuracy: 0.6135 - val_loss: 1.6923 - val_accuracy: 0.5984
Epoch 31/200
266/266 [==============================] - 2s 6ms/step - loss: 1.6302 - accuracy: 0.5968 - val_loss: 1.7776 - val_accuracy: 0.5678
Epoch 32/200
266/266 [==============================] - 2s 6ms/step - loss: 1.5716 - accuracy: 0.6095 - val_loss: 1.8088 - val_accuracy: 0.5645
Epoch 33/200
266/266 [==============================] - 1s 5ms/step - loss: 1.5217 - accuracy: 0.6244 - val_loss: 1.7154 - val_accuracy: 0.5785
Epoch 34/200
266/266 [==============================] - 1s 5ms/step - loss: 1.5025 - accuracy: 0.6304 - val_loss: 1.7773 - val_accuracy: 0.5592
Epoch 35/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4976 - accuracy: 0.6213 - val_loss: 1.6617 - val_accuracy: 0.5911
Epoch 36/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4691 - accuracy: 0.6351 - val_loss: 1.8598 - val_accuracy: 0.5479
Epoch 37/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4341 - accuracy: 0.6446 - val_loss: 1.7716 - val_accuracy: 0.5632
Epoch 38/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4149 - accuracy: 0.6491 - val_loss: 1.7063 - val_accuracy: 0.5858
Epoch 39/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4169 - accuracy: 0.6487 - val_loss: 1.6135 - val_accuracy: 0.6037
Epoch 40/200
266/266 [==============================] - 1s 5ms/step - loss: 1.4103 - accuracy: 0.6460 - val_loss: 1.5607 - val_accuracy: 0.6130
Epoch 41/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3597 - accuracy: 0.6626 - val_loss: 1.5935 - val_accuracy: 0.6004
Epoch 42/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3419 - accuracy: 0.6573 - val_loss: 1.6378 - val_accuracy: 0.5984
Epoch 43/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3403 - accuracy: 0.6647 - val_loss: 1.5001 - val_accuracy: 0.6230
Epoch 44/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3201 - accuracy: 0.6615 - val_loss: 1.6570 - val_accuracy: 0.5838
Epoch 45/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2959 - accuracy: 0.6772 - val_loss: 2.0763 - val_accuracy: 0.4914
Epoch 46/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2707 - accuracy: 0.6833 - val_loss: 1.5081 - val_accuracy: 0.6270
Epoch 47/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2833 - accuracy: 0.6684 - val_loss: 1.5070 - val_accuracy: 0.6283
Epoch 48/200
266/266 [==============================] - 1s 6ms/step - loss: 1.2585 - accuracy: 0.6892 - val_loss: 1.6296 - val_accuracy: 0.5964
Epoch 49/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2272 - accuracy: 0.6863 - val_loss: 1.4268 - val_accuracy: 0.6423
Epoch 50/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2148 - accuracy: 0.6842 - val_loss: 1.4972 - val_accuracy: 0.6237
Epoch 51/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1706 - accuracy: 0.7027 - val_loss: 1.6496 - val_accuracy: 0.5811
Epoch 52/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1826 - accuracy: 0.6911 - val_loss: 1.6186 - val_accuracy: 0.5871
Epoch 53/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1797 - accuracy: 0.6988 - val_loss: 1.3935 - val_accuracy: 0.6509
Epoch 54/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1693 - accuracy: 0.7072 - val_loss: 1.5528 - val_accuracy: 0.6004
Epoch 55/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1344 - accuracy: 0.6998 - val_loss: 1.5128 - val_accuracy: 0.6170
Epoch 56/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1341 - accuracy: 0.7026 - val_loss: 1.5323 - val_accuracy: 0.6223
Epoch 57/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0784 - accuracy: 0.7273 - val_loss: 1.5608 - val_accuracy: 0.6070
Epoch 58/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1169 - accuracy: 0.7191 - val_loss: 1.4420 - val_accuracy: 0.6390
Epoch 59/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1029 - accuracy: 0.7106 - val_loss: 1.3917 - val_accuracy: 0.6483
Epoch 60/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0737 - accuracy: 0.7255 - val_loss: 1.5038 - val_accuracy: 0.6210
Epoch 61/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0539 - accuracy: 0.7229 - val_loss: 1.3907 - val_accuracy: 0.6483
Epoch 62/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0393 - accuracy: 0.7294 - val_loss: 1.4487 - val_accuracy: 0.6396
Epoch 63/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0402 - accuracy: 0.7257 - val_loss: 1.3677 - val_accuracy: 0.6602
Epoch 64/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0486 - accuracy: 0.7244 - val_loss: 1.4437 - val_accuracy: 0.6343
Epoch 65/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0325 - accuracy: 0.7295 - val_loss: 1.3374 - val_accuracy: 0.6556
Epoch 66/200
266/266 [==============================] - 1s 6ms/step - loss: 1.0333 - accuracy: 0.7358 - val_loss: 1.3259 - val_accuracy: 0.6582
Epoch 67/200
266/266 [==============================] - 1s 6ms/step - loss: 1.0013 - accuracy: 0.7406 - val_loss: 1.3559 - val_accuracy: 0.6483
Epoch 68/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9992 - accuracy: 0.7392 - val_loss: 1.4178 - val_accuracy: 0.6343
Epoch 69/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0100 - accuracy: 0.7359 - val_loss: 1.5855 - val_accuracy: 0.6124
Epoch 70/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0003 - accuracy: 0.7322 - val_loss: 1.3326 - val_accuracy: 0.6509
Epoch 71/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9916 - accuracy: 0.7400 - val_loss: 1.4598 - val_accuracy: 0.6270
Epoch 72/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9990 - accuracy: 0.7439 - val_loss: 1.3851 - val_accuracy: 0.6449
Epoch 73/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9600 - accuracy: 0.7435 - val_loss: 1.3739 - val_accuracy: 0.6469
Epoch 74/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9662 - accuracy: 0.7465 - val_loss: 1.3282 - val_accuracy: 0.6616
Epoch 75/200
266/266 [==============================] - 1s 6ms/step - loss: 0.9374 - accuracy: 0.7507 - val_loss: 1.4503 - val_accuracy: 0.6403
Epoch 76/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9347 - accuracy: 0.7567 - val_loss: 1.2933 - val_accuracy: 0.6762
Epoch 77/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9263 - accuracy: 0.7628 - val_loss: 1.3495 - val_accuracy: 0.6443
Epoch 78/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9212 - accuracy: 0.7461 - val_loss: 1.3362 - val_accuracy: 0.6569
Epoch 79/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9073 - accuracy: 0.7587 - val_loss: 1.4438 - val_accuracy: 0.6356
Epoch 80/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9279 - accuracy: 0.7499 - val_loss: 1.4578 - val_accuracy: 0.6230
Epoch 81/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8807 - accuracy: 0.7732 - val_loss: 1.3694 - val_accuracy: 0.6529
Epoch 82/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9035 - accuracy: 0.7672 - val_loss: 1.5011 - val_accuracy: 0.6210
Epoch 83/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8646 - accuracy: 0.7779 - val_loss: 1.6076 - val_accuracy: 0.5991
Epoch 84/200
266/266 [==============================] - 1s 6ms/step - loss: 0.8885 - accuracy: 0.7615 - val_loss: 1.3762 - val_accuracy: 0.6496
Epoch 85/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8676 - accuracy: 0.7696 - val_loss: 1.3264 - val_accuracy: 0.6582
Epoch 86/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8471 - accuracy: 0.7769 - val_loss: 1.2863 - val_accuracy: 0.6729
Epoch 87/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8691 - accuracy: 0.7714 - val_loss: 1.3709 - val_accuracy: 0.6556
Epoch 88/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8595 - accuracy: 0.7713 - val_loss: 1.3171 - val_accuracy: 0.6682
Epoch 89/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8616 - accuracy: 0.7710 - val_loss: 1.3845 - val_accuracy: 0.6476
Epoch 90/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8302 - accuracy: 0.7769 - val_loss: 1.4264 - val_accuracy: 0.6323
Epoch 91/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8361 - accuracy: 0.7745 - val_loss: 1.2609 - val_accuracy: 0.6749
Epoch 92/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8047 - accuracy: 0.7864 - val_loss: 1.3307 - val_accuracy: 0.6543
Epoch 93/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8153 - accuracy: 0.7778 - val_loss: 1.4179 - val_accuracy: 0.6343
Epoch 94/200
266/266 [==============================] - 1s 6ms/step - loss: 0.8200 - accuracy: 0.7774 - val_loss: 1.2534 - val_accuracy: 0.6789
Epoch 95/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8193 - accuracy: 0.7798 - val_loss: 1.2924 - val_accuracy: 0.6642
Epoch 96/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8153 - accuracy: 0.7842 - val_loss: 1.3509 - val_accuracy: 0.6516
Epoch 97/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8022 - accuracy: 0.7887 - val_loss: 1.2518 - val_accuracy: 0.6762
Epoch 98/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7772 - accuracy: 0.7918 - val_loss: 1.3956 - val_accuracy: 0.6449
Epoch 99/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7930 - accuracy: 0.7828 - val_loss: 1.2982 - val_accuracy: 0.6616
Epoch 100/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7995 - accuracy: 0.7855 - val_loss: 1.2339 - val_accuracy: 0.6822
Epoch 101/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7738 - accuracy: 0.7988 - val_loss: 1.2994 - val_accuracy: 0.6815
Epoch 102/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7547 - accuracy: 0.7924 - val_loss: 1.3202 - val_accuracy: 0.6589
Epoch 103/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7690 - accuracy: 0.7929 - val_loss: 1.4071 - val_accuracy: 0.6503
Epoch 104/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7517 - accuracy: 0.7998 - val_loss: 1.3579 - val_accuracy: 0.6576
Epoch 105/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7603 - accuracy: 0.7949 - val_loss: 1.2382 - val_accuracy: 0.6888
Epoch 106/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7419 - accuracy: 0.8042 - val_loss: 1.4826 - val_accuracy: 0.6263
Epoch 107/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7454 - accuracy: 0.8043 - val_loss: 1.3022 - val_accuracy: 0.6656
Epoch 108/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7345 - accuracy: 0.8093 - val_loss: 1.2772 - val_accuracy: 0.6742
Epoch 109/200
266/266 [==============================] - 1s 6ms/step - loss: 0.7453 - accuracy: 0.7996 - val_loss: 1.2896 - val_accuracy: 0.6789
Epoch 110/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7301 - accuracy: 0.8068 - val_loss: 1.3189 - val_accuracy: 0.6735
Epoch 111/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7326 - accuracy: 0.8081 - val_loss: 1.2562 - val_accuracy: 0.6935
Epoch 112/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7306 - accuracy: 0.8143 - val_loss: 1.4672 - val_accuracy: 0.6516
Epoch 113/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7230 - accuracy: 0.8012 - val_loss: 1.3383 - val_accuracy: 0.6755
Epoch 114/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7185 - accuracy: 0.8064 - val_loss: 1.2280 - val_accuracy: 0.6922
Epoch 115/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7091 - accuracy: 0.8127 - val_loss: 1.3936 - val_accuracy: 0.6562
Epoch 116/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7194 - accuracy: 0.8086 - val_loss: 1.3513 - val_accuracy: 0.6676
Epoch 117/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6829 - accuracy: 0.8170 - val_loss: 1.2652 - val_accuracy: 0.6749
Epoch 118/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7028 - accuracy: 0.8090 - val_loss: 1.2539 - val_accuracy: 0.6802
Epoch 119/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7076 - accuracy: 0.8099 - val_loss: 1.4300 - val_accuracy: 0.6503
Epoch 120/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6941 - accuracy: 0.8209 - val_loss: 1.2805 - val_accuracy: 0.6828
Epoch 121/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6845 - accuracy: 0.8217 - val_loss: 1.3893 - val_accuracy: 0.6636
Epoch 122/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6770 - accuracy: 0.8153 - val_loss: 1.2752 - val_accuracy: 0.6935
Epoch 123/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6903 - accuracy: 0.8157 - val_loss: 1.3328 - val_accuracy: 0.6656
Epoch 124/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6696 - accuracy: 0.8224 - val_loss: 1.2570 - val_accuracy: 0.6822
Epoch 125/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6372 - accuracy: 0.8324 - val_loss: 1.3225 - val_accuracy: 0.6722
Epoch 126/200
266/266 [==============================] - 1s 6ms/step - loss: 0.6696 - accuracy: 0.8209 - val_loss: 1.2766 - val_accuracy: 0.6802
Epoch 127/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6768 - accuracy: 0.8217 - val_loss: 1.3543 - val_accuracy: 0.6789
Epoch 128/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6841 - accuracy: 0.8208 - val_loss: 1.3462 - val_accuracy: 0.6602
Epoch 129/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6723 - accuracy: 0.8159 - val_loss: 1.2947 - val_accuracy: 0.6676
Epoch 130/200
266/266 [==============================] - 1s 6ms/step - loss: 0.6954 - accuracy: 0.8134 - val_loss: 1.2961 - val_accuracy: 0.6762
Epoch 131/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6512 - accuracy: 0.8279 - val_loss: 1.3188 - val_accuracy: 0.6762
Epoch 132/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6875 - accuracy: 0.8176 - val_loss: 1.2988 - val_accuracy: 0.6755
Epoch 133/200
266/266 [==============================] - 1s 6ms/step - loss: 0.6548 - accuracy: 0.8218 - val_loss: 1.2586 - val_accuracy: 0.6882
Epoch 134/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6329 - accuracy: 0.8362 - val_loss: 1.3722 - val_accuracy: 0.6576
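The run above halts at epoch 134 of 200, consistent with the patience-based `callback` passed to `train_model`. The core logic of such a callback can be sketched without Keras (the patience value here is an assumption for illustration; the real `tf.keras.callbacks.EarlyStopping` also restores the best weights):

```python
def early_stop_epoch(val_losses, patience=15):
    """Return the 1-based epoch at which training would halt, or None if it runs out."""
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_epoch = loss, epoch       # new best: reset the patience window
        elif epoch - best_epoch >= patience:
            return epoch                         # no improvement for `patience` epochs
    return None

# A curve whose best value is at epoch 3 halts `patience` epochs later.
halted_at = early_stop_epoch([1.0, 0.9, 0.8, 0.85, 0.85, 0.85], patience=3)
```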
In [ ]:
_, accuracy = model_report(SIMPLE_MODEL_OPTIMIZED, SIMPLE_MODEL_OPTIMIZED_history)
accuracies_opt_RMSprop["SIMPLE_MODEL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.326
Accuracy: 67.262%
CNN1
In [ ]:
CNN1_MODEL_OPTIMIZED = init_cnn1_model_optimized(summary = True, optimizer = tf.optimizers.RMSprop)
CNN1_MODEL_OPTIMIZED_history = train_model(CNN1_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_3 (Conv2D)            (None, 30, 30, 32)        896       
_________________________________________________________________
batch_normalization_3 (Batch (None, 30, 30, 32)        128       
_________________________________________________________________
re_lu_3 (ReLU)               (None, 30, 30, 32)        0         
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 15, 15, 32)        0         
_________________________________________________________________
dropout_3 (Dropout)          (None, 15, 15, 32)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 13, 13, 64)        18496     
_________________________________________________________________
batch_normalization_4 (Batch (None, 13, 13, 64)        256       
_________________________________________________________________
re_lu_4 (ReLU)               (None, 13, 13, 64)        0         
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 6, 6, 64)          0         
_________________________________________________________________
dropout_4 (Dropout)          (None, 6, 6, 64)          0         
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 4, 4, 128)         73856     
_________________________________________________________________
batch_normalization_5 (Batch (None, 4, 4, 128)         512       
_________________________________________________________________
re_lu_5 (ReLU)               (None, 4, 4, 128)         0         
_________________________________________________________________
average_pooling2d (AveragePo (None, 2, 2, 128)         0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 2, 2, 128)         0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 512)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 1024)              525312    
_________________________________________________________________
dropout_6 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 20)                20500     
=================================================================
Total params: 639,956
Trainable params: 639,508
Non-trainable params: 448
_________________________________________________________________
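The Output Shape column in the CNN1 summary above follows directly from "valid" convolutions and stride-2 pooling; a quick check, assuming 3×3 kernels as before:

```python
def conv_valid(n, k=3):
    return n - k + 1          # no padding: spatial size shrinks by k - 1

def pool(n, s=2):
    return n // s             # max and average pooling both halve the size

n = 32                        # CIFAR-100 input resolution
shapes = []
for _ in range(2):            # two conv -> max-pool stages
    n = conv_valid(n); shapes.append(n)
    n = pool(n);       shapes.append(n)
n = conv_valid(n); shapes.append(n)   # third conv
n = pool(n);       shapes.append(n)   # final average pooling
```

This reproduces the 30 → 15 → 13 → 6 → 4 → 2 progression printed in the summary, so the flattened feature vector is 2 · 2 · 128 = 512, matching the `flatten_1` layer.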
Epoch 1/200
266/266 [==============================] - 3s 6ms/step - loss: 4.1480 - accuracy: 0.1402 - val_loss: 4.6489 - val_accuracy: 0.0745
Epoch 2/200
266/266 [==============================] - 1s 5ms/step - loss: 3.5846 - accuracy: 0.2689 - val_loss: 3.8664 - val_accuracy: 0.1802
Epoch 3/200
266/266 [==============================] - 2s 6ms/step - loss: 3.2813 - accuracy: 0.3249 - val_loss: 3.1695 - val_accuracy: 0.3504
Epoch 4/200
266/266 [==============================] - 2s 6ms/step - loss: 3.0707 - accuracy: 0.3663 - val_loss: 3.0451 - val_accuracy: 0.3590
Epoch 5/200
266/266 [==============================] - 1s 5ms/step - loss: 2.8993 - accuracy: 0.3878 - val_loss: 2.8041 - val_accuracy: 0.4096
Epoch 6/200
266/266 [==============================] - 1s 5ms/step - loss: 2.7343 - accuracy: 0.4275 - val_loss: 2.9218 - val_accuracy: 0.3557
Epoch 7/200
266/266 [==============================] - 2s 6ms/step - loss: 2.6719 - accuracy: 0.4306 - val_loss: 2.9496 - val_accuracy: 0.3484
Epoch 8/200
266/266 [==============================] - 1s 5ms/step - loss: 2.5173 - accuracy: 0.4529 - val_loss: 2.7574 - val_accuracy: 0.3730
Epoch 9/200
266/266 [==============================] - 1s 5ms/step - loss: 2.4058 - accuracy: 0.4834 - val_loss: 2.6333 - val_accuracy: 0.4176
Epoch 10/200
266/266 [==============================] - 1s 5ms/step - loss: 2.3222 - accuracy: 0.4919 - val_loss: 2.5442 - val_accuracy: 0.4355
Epoch 11/200
266/266 [==============================] - 1s 5ms/step - loss: 2.2332 - accuracy: 0.5111 - val_loss: 2.5689 - val_accuracy: 0.4269
Epoch 12/200
266/266 [==============================] - 1s 5ms/step - loss: 2.1774 - accuracy: 0.5190 - val_loss: 2.4010 - val_accuracy: 0.4601
Epoch 13/200
266/266 [==============================] - 2s 6ms/step - loss: 2.1117 - accuracy: 0.5239 - val_loss: 2.5912 - val_accuracy: 0.4382
Epoch 14/200
266/266 [==============================] - 1s 6ms/step - loss: 2.0572 - accuracy: 0.5393 - val_loss: 2.4011 - val_accuracy: 0.4461
Epoch 15/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9954 - accuracy: 0.5506 - val_loss: 2.3849 - val_accuracy: 0.4574
Epoch 16/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9434 - accuracy: 0.5543 - val_loss: 2.1304 - val_accuracy: 0.5153
Epoch 17/200
266/266 [==============================] - 2s 6ms/step - loss: 1.8848 - accuracy: 0.5699 - val_loss: 2.3832 - val_accuracy: 0.4621
Epoch 18/200
266/266 [==============================] - 1s 5ms/step - loss: 1.8549 - accuracy: 0.5716 - val_loss: 2.2085 - val_accuracy: 0.4914
Epoch 19/200
266/266 [==============================] - 1s 5ms/step - loss: 1.8207 - accuracy: 0.5846 - val_loss: 2.1662 - val_accuracy: 0.5000
Epoch 20/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7812 - accuracy: 0.5908 - val_loss: 2.3495 - val_accuracy: 0.4641
Epoch 21/200
266/266 [==============================] - 1s 5ms/step - loss: 1.7221 - accuracy: 0.6004 - val_loss: 2.0187 - val_accuracy: 0.5306
Epoch 22/200
266/266 [==============================] - 1s 5ms/step - loss: 1.6866 - accuracy: 0.6014 - val_loss: 2.5474 - val_accuracy: 0.4275
Epoch 23/200
266/266 [==============================] - 1s 5ms/step - loss: 1.6299 - accuracy: 0.6224 - val_loss: 2.2550 - val_accuracy: 0.4761
Epoch 24/200
266/266 [==============================] - 1s 5ms/step - loss: 1.6270 - accuracy: 0.6049 - val_loss: 2.0899 - val_accuracy: 0.5027
Epoch 25/200
266/266 [==============================] - 1s 5ms/step - loss: 1.5899 - accuracy: 0.6239 - val_loss: 1.9089 - val_accuracy: 0.5559
Epoch 26/200
266/266 [==============================] - 2s 6ms/step - loss: 1.5415 - accuracy: 0.6301 - val_loss: 1.9491 - val_accuracy: 0.5326
Epoch 27/200
266/266 [==============================] - 2s 6ms/step - loss: 1.5335 - accuracy: 0.6233 - val_loss: 1.8047 - val_accuracy: 0.5705
Epoch 28/200
266/266 [==============================] - 1s 5ms/step - loss: 1.5116 - accuracy: 0.6392 - val_loss: 1.6740 - val_accuracy: 0.5997
Epoch 29/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4642 - accuracy: 0.6466 - val_loss: 1.8546 - val_accuracy: 0.5592
Epoch 30/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4366 - accuracy: 0.6485 - val_loss: 1.9076 - val_accuracy: 0.5545
Epoch 31/200
266/266 [==============================] - 2s 6ms/step - loss: 1.4130 - accuracy: 0.6569 - val_loss: 1.7144 - val_accuracy: 0.5864
Epoch 32/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3779 - accuracy: 0.6688 - val_loss: 2.0637 - val_accuracy: 0.5120
Epoch 33/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3603 - accuracy: 0.6702 - val_loss: 1.7703 - val_accuracy: 0.5758
Epoch 34/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3316 - accuracy: 0.6760 - val_loss: 1.8667 - val_accuracy: 0.5499
Epoch 35/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3185 - accuracy: 0.6713 - val_loss: 1.6416 - val_accuracy: 0.6017
Epoch 36/200
266/266 [==============================] - 1s 5ms/step - loss: 1.3234 - accuracy: 0.6755 - val_loss: 1.6291 - val_accuracy: 0.5944
Epoch 37/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2916 - accuracy: 0.6795 - val_loss: 1.8346 - val_accuracy: 0.5618
Epoch 38/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2723 - accuracy: 0.6791 - val_loss: 1.7548 - val_accuracy: 0.5791
Epoch 39/200
266/266 [==============================] - 1s 5ms/step - loss: 1.2471 - accuracy: 0.6882 - val_loss: 1.7479 - val_accuracy: 0.5791
Epoch 40/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2034 - accuracy: 0.7054 - val_loss: 1.8230 - val_accuracy: 0.5711
Epoch 41/200
266/266 [==============================] - 1s 6ms/step - loss: 1.2118 - accuracy: 0.7048 - val_loss: 1.5182 - val_accuracy: 0.6363
Epoch 42/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1799 - accuracy: 0.7136 - val_loss: 1.5805 - val_accuracy: 0.6203
Epoch 43/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1865 - accuracy: 0.7024 - val_loss: 1.5767 - val_accuracy: 0.6237
Epoch 44/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1352 - accuracy: 0.7232 - val_loss: 1.6929 - val_accuracy: 0.5831
Epoch 45/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1308 - accuracy: 0.7122 - val_loss: 1.5909 - val_accuracy: 0.6170
Epoch 46/200
266/266 [==============================] - 1s 5ms/step - loss: 1.1161 - accuracy: 0.7206 - val_loss: 1.5665 - val_accuracy: 0.6250
Epoch 47/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0945 - accuracy: 0.7307 - val_loss: 1.5725 - val_accuracy: 0.6190
Epoch 48/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1174 - accuracy: 0.7140 - val_loss: 1.4246 - val_accuracy: 0.6516
Epoch 49/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0564 - accuracy: 0.7258 - val_loss: 1.4248 - val_accuracy: 0.6569
Epoch 50/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0882 - accuracy: 0.7293 - val_loss: 1.5650 - val_accuracy: 0.6184
Epoch 51/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0559 - accuracy: 0.7310 - val_loss: 1.4532 - val_accuracy: 0.6443
Epoch 52/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0445 - accuracy: 0.7359 - val_loss: 1.4662 - val_accuracy: 0.6430
Epoch 53/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0480 - accuracy: 0.7349 - val_loss: 1.4865 - val_accuracy: 0.6330
Epoch 54/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0242 - accuracy: 0.7422 - val_loss: 1.4592 - val_accuracy: 0.6410
Epoch 55/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9639 - accuracy: 0.7572 - val_loss: 1.5151 - val_accuracy: 0.6416
Epoch 56/200
266/266 [==============================] - 1s 5ms/step - loss: 1.0076 - accuracy: 0.7419 - val_loss: 1.5448 - val_accuracy: 0.6223
Epoch 57/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9851 - accuracy: 0.7453 - val_loss: 1.7900 - val_accuracy: 0.5944
Epoch 58/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9575 - accuracy: 0.7571 - val_loss: 1.5268 - val_accuracy: 0.6316
Epoch 59/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9562 - accuracy: 0.7540 - val_loss: 1.3510 - val_accuracy: 0.6722
Epoch 60/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9425 - accuracy: 0.7586 - val_loss: 1.3332 - val_accuracy: 0.6709
Epoch 61/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9481 - accuracy: 0.7584 - val_loss: 1.3255 - val_accuracy: 0.6882
Epoch 62/200
266/266 [==============================] - 1s 5ms/step - loss: 0.9145 - accuracy: 0.7680 - val_loss: 1.4228 - val_accuracy: 0.6536
Epoch 63/200
266/266 [==============================] - 2s 6ms/step - loss: 0.9014 - accuracy: 0.7733 - val_loss: 1.3787 - val_accuracy: 0.6616
Epoch 64/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8888 - accuracy: 0.7749 - val_loss: 1.4195 - val_accuracy: 0.6676
Epoch 65/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8645 - accuracy: 0.7818 - val_loss: 1.4693 - val_accuracy: 0.6656
Epoch 66/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8792 - accuracy: 0.7773 - val_loss: 1.2854 - val_accuracy: 0.6915
Epoch 67/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8611 - accuracy: 0.7713 - val_loss: 1.4032 - val_accuracy: 0.6622
Epoch 68/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8416 - accuracy: 0.7844 - val_loss: 1.4457 - val_accuracy: 0.6562
Epoch 69/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8569 - accuracy: 0.7686 - val_loss: 1.3739 - val_accuracy: 0.6596
Epoch 70/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8507 - accuracy: 0.7741 - val_loss: 1.2546 - val_accuracy: 0.6961
Epoch 71/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8690 - accuracy: 0.7745 - val_loss: 1.3384 - val_accuracy: 0.6749
Epoch 72/200
266/266 [==============================] - 1s 5ms/step - loss: 0.8270 - accuracy: 0.7863 - val_loss: 1.4257 - val_accuracy: 0.6642
Epoch 73/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8314 - accuracy: 0.7902 - val_loss: 1.3722 - val_accuracy: 0.6709
Epoch 74/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8163 - accuracy: 0.7927 - val_loss: 1.4072 - val_accuracy: 0.6709
Epoch 75/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8187 - accuracy: 0.7882 - val_loss: 1.3501 - val_accuracy: 0.6815
Epoch 76/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7918 - accuracy: 0.7943 - val_loss: 1.3246 - val_accuracy: 0.6822
Epoch 77/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8066 - accuracy: 0.7858 - val_loss: 1.2963 - val_accuracy: 0.6868
Epoch 78/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8047 - accuracy: 0.7957 - val_loss: 1.1930 - val_accuracy: 0.7055
Epoch 79/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7832 - accuracy: 0.8044 - val_loss: 1.4803 - val_accuracy: 0.6516
Epoch 80/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7568 - accuracy: 0.8094 - val_loss: 1.5142 - val_accuracy: 0.6589
Epoch 81/200
266/266 [==============================] - 1s 6ms/step - loss: 0.7815 - accuracy: 0.8020 - val_loss: 1.2417 - val_accuracy: 0.7015
Epoch 82/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7627 - accuracy: 0.8028 - val_loss: 1.3392 - val_accuracy: 0.6689
Epoch 83/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7645 - accuracy: 0.8071 - val_loss: 1.3596 - val_accuracy: 0.6822
Epoch 84/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7573 - accuracy: 0.7997 - val_loss: 1.2414 - val_accuracy: 0.7015
Epoch 85/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7329 - accuracy: 0.8049 - val_loss: 1.4190 - val_accuracy: 0.6676
Epoch 86/200
266/266 [==============================] - 1s 5ms/step - loss: 0.7282 - accuracy: 0.8150 - val_loss: 1.4430 - val_accuracy: 0.6702
Epoch 87/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7291 - accuracy: 0.8130 - val_loss: 1.2833 - val_accuracy: 0.6902
Epoch 88/200
266/266 [==============================] - 1s 6ms/step - loss: 0.7212 - accuracy: 0.8185 - val_loss: 1.3047 - val_accuracy: 0.6928
Epoch 89/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7184 - accuracy: 0.8165 - val_loss: 1.5340 - val_accuracy: 0.6396
Epoch 90/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7236 - accuracy: 0.8176 - val_loss: 1.3195 - val_accuracy: 0.6908
Epoch 91/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7297 - accuracy: 0.8102 - val_loss: 1.2541 - val_accuracy: 0.7035
Epoch 92/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7242 - accuracy: 0.8194 - val_loss: 1.3385 - val_accuracy: 0.6848
Epoch 93/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6948 - accuracy: 0.8217 - val_loss: 1.2188 - val_accuracy: 0.7074
Epoch 94/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6716 - accuracy: 0.8293 - val_loss: 1.5115 - val_accuracy: 0.6556
Epoch 95/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6963 - accuracy: 0.8189 - val_loss: 1.4675 - val_accuracy: 0.6562
Epoch 96/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6899 - accuracy: 0.8237 - val_loss: 1.4336 - val_accuracy: 0.6762
Epoch 97/200
266/266 [==============================] - 2s 6ms/step - loss: 0.6703 - accuracy: 0.8265 - val_loss: 1.2402 - val_accuracy: 0.7041
Epoch 98/200
266/266 [==============================] - 1s 5ms/step - loss: 0.6659 - accuracy: 0.8333 - val_loss: 1.4250 - val_accuracy: 0.6795
In [ ]:
_, accuracy = model_report(CNN1_MODEL_OPTIMIZED, CNN1_MODEL_OPTIMIZED_history)
accuracies_opt_RMSprop["CNN1"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.219
Accuracy: 68.849%
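Every run in this section stops well before the 200-epoch limit (epoch 98, 74, 25, ...), which is consistent with early stopping on validation loss. A hypothetical reconstruction of the `callback` passed to `train_model` could look like this; the monitored metric and especially the patience value are assumptions, since the definition is not shown here:

```python
import tensorflow as tf

# Hypothetical reconstruction of the `callback` used in the train_model() calls.
# The exact patience is an assumption; only the early-stopped epoch counts are visible.
callback = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",           # stop when validation loss stops improving
    patience=20,                  # assumed number of epochs without improvement
    restore_best_weights=True,    # keep the best validation-loss weights
)
```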
CNN2
In [ ]:
CNN2_MODEL_OPTIMIZED = init_cnn2_model_optimized(summary=True, optimizer=tf.optimizers.RMSprop)
CNN2_MODEL_OPTIMIZED_history = train_model(CNN2_MODEL_OPTIMIZED, epochs=200, callbacks=[callback])
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_6 (Conv2D)            (None, 32, 32, 32)        896       
_________________________________________________________________
batch_normalization_6 (Batch (None, 32, 32, 32)        128       
_________________________________________________________________
re_lu_6 (ReLU)               (None, 32, 32, 32)        0         
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 16, 16, 32)        0         
_________________________________________________________________
dropout_7 (Dropout)          (None, 16, 16, 32)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 16, 16, 64)        18496     
_________________________________________________________________
batch_normalization_7 (Batch (None, 16, 16, 64)        256       
_________________________________________________________________
re_lu_7 (ReLU)               (None, 16, 16, 64)        0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 8, 8, 64)          0         
_________________________________________________________________
dropout_8 (Dropout)          (None, 8, 8, 64)          0         
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 8, 8, 128)         73856     
_________________________________________________________________
batch_normalization_8 (Batch (None, 8, 8, 128)         512       
_________________________________________________________________
re_lu_8 (ReLU)               (None, 8, 8, 128)         0         
_________________________________________________________________
max_pooling2d_6 (MaxPooling2 (None, 4, 4, 128)         0         
_________________________________________________________________
dropout_9 (Dropout)          (None, 4, 4, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 4, 4, 256)         295168    
_________________________________________________________________
batch_normalization_9 (Batch (None, 4, 4, 256)         1024      
_________________________________________________________________
re_lu_9 (ReLU)               (None, 4, 4, 256)         0         
_________________________________________________________________
dropout_10 (Dropout)         (None, 4, 4, 256)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 4096)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2097664   
_________________________________________________________________
dropout_11 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 20)                10260     
=================================================================
Total params: 2,498,260
Trainable params: 2,497,300
Non-trainable params: 960
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 3s 7ms/step - loss: 5.9484 - accuracy: 0.1324 - val_loss: 6.3771 - val_accuracy: 0.0745
Epoch 2/200
266/266 [==============================] - 2s 7ms/step - loss: 5.2508 - accuracy: 0.2343 - val_loss: 5.4446 - val_accuracy: 0.1622
Epoch 3/200
266/266 [==============================] - 2s 6ms/step - loss: 4.8532 - accuracy: 0.2906 - val_loss: 4.8537 - val_accuracy: 0.2706
Epoch 4/200
266/266 [==============================] - 2s 7ms/step - loss: 4.5103 - accuracy: 0.3335 - val_loss: 4.8689 - val_accuracy: 0.2553
Epoch 5/200
266/266 [==============================] - 2s 6ms/step - loss: 4.1978 - accuracy: 0.3857 - val_loss: 4.8418 - val_accuracy: 0.2533
Epoch 6/200
266/266 [==============================] - 2s 7ms/step - loss: 3.9156 - accuracy: 0.4224 - val_loss: 4.3224 - val_accuracy: 0.3271
Epoch 7/200
266/266 [==============================] - 2s 6ms/step - loss: 3.7181 - accuracy: 0.4367 - val_loss: 4.3903 - val_accuracy: 0.2985
Epoch 8/200
266/266 [==============================] - 2s 7ms/step - loss: 3.5184 - accuracy: 0.4574 - val_loss: 4.2346 - val_accuracy: 0.3351
Epoch 9/200
266/266 [==============================] - 2s 6ms/step - loss: 3.3297 - accuracy: 0.4822 - val_loss: 3.8292 - val_accuracy: 0.3697
Epoch 10/200
266/266 [==============================] - 2s 7ms/step - loss: 3.1555 - accuracy: 0.5012 - val_loss: 4.3158 - val_accuracy: 0.3258
Epoch 11/200
266/266 [==============================] - 2s 7ms/step - loss: 3.0291 - accuracy: 0.5136 - val_loss: 4.1386 - val_accuracy: 0.3364
Epoch 12/200
266/266 [==============================] - 2s 7ms/step - loss: 2.8687 - accuracy: 0.5359 - val_loss: 4.5579 - val_accuracy: 0.2886
Epoch 13/200
266/266 [==============================] - 2s 7ms/step - loss: 2.7625 - accuracy: 0.5474 - val_loss: 3.8008 - val_accuracy: 0.3763
Epoch 14/200
266/266 [==============================] - 2s 6ms/step - loss: 2.6570 - accuracy: 0.5542 - val_loss: 4.1197 - val_accuracy: 0.3384
Epoch 15/200
266/266 [==============================] - 2s 7ms/step - loss: 2.5615 - accuracy: 0.5634 - val_loss: 3.7283 - val_accuracy: 0.3903
Epoch 16/200
266/266 [==============================] - 2s 7ms/step - loss: 2.4317 - accuracy: 0.5923 - val_loss: 3.1849 - val_accuracy: 0.4654
Epoch 17/200
266/266 [==============================] - 2s 6ms/step - loss: 2.3364 - accuracy: 0.5902 - val_loss: 3.4917 - val_accuracy: 0.4202
Epoch 18/200
266/266 [==============================] - 2s 7ms/step - loss: 2.2485 - accuracy: 0.6022 - val_loss: 4.2760 - val_accuracy: 0.3371
Epoch 19/200
266/266 [==============================] - 2s 7ms/step - loss: 2.1781 - accuracy: 0.6154 - val_loss: 2.7701 - val_accuracy: 0.5093
Epoch 20/200
266/266 [==============================] - 2s 7ms/step - loss: 2.0792 - accuracy: 0.6312 - val_loss: 2.8977 - val_accuracy: 0.4847
Epoch 21/200
266/266 [==============================] - 2s 7ms/step - loss: 2.0285 - accuracy: 0.6319 - val_loss: 2.6950 - val_accuracy: 0.5219
Epoch 22/200
266/266 [==============================] - 2s 6ms/step - loss: 1.9780 - accuracy: 0.6305 - val_loss: 2.9086 - val_accuracy: 0.4747
Epoch 23/200
266/266 [==============================] - 2s 6ms/step - loss: 1.8892 - accuracy: 0.6499 - val_loss: 2.7416 - val_accuracy: 0.4953
Epoch 24/200
266/266 [==============================] - 2s 6ms/step - loss: 1.8776 - accuracy: 0.6471 - val_loss: 2.6319 - val_accuracy: 0.5246
Epoch 25/200
266/266 [==============================] - 2s 6ms/step - loss: 1.8302 - accuracy: 0.6535 - val_loss: 2.6245 - val_accuracy: 0.5359
Epoch 26/200
266/266 [==============================] - 2s 6ms/step - loss: 1.7397 - accuracy: 0.6645 - val_loss: 2.6632 - val_accuracy: 0.5180
Epoch 27/200
266/266 [==============================] - 2s 6ms/step - loss: 1.6986 - accuracy: 0.6744 - val_loss: 2.6248 - val_accuracy: 0.4947
Epoch 28/200
266/266 [==============================] - 2s 7ms/step - loss: 1.6152 - accuracy: 0.6889 - val_loss: 2.1701 - val_accuracy: 0.5884
Epoch 29/200
266/266 [==============================] - 2s 7ms/step - loss: 1.5925 - accuracy: 0.7000 - val_loss: 2.2339 - val_accuracy: 0.5778
Epoch 30/200
266/266 [==============================] - 2s 7ms/step - loss: 1.5486 - accuracy: 0.6975 - val_loss: 2.7099 - val_accuracy: 0.5007
Epoch 31/200
266/266 [==============================] - 2s 7ms/step - loss: 1.4923 - accuracy: 0.7116 - val_loss: 2.1495 - val_accuracy: 0.5964
Epoch 32/200
266/266 [==============================] - 2s 7ms/step - loss: 1.4210 - accuracy: 0.7246 - val_loss: 2.4746 - val_accuracy: 0.5386
Epoch 33/200
266/266 [==============================] - 2s 7ms/step - loss: 1.4150 - accuracy: 0.7218 - val_loss: 1.9624 - val_accuracy: 0.6230
Epoch 34/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3901 - accuracy: 0.7248 - val_loss: 2.3759 - val_accuracy: 0.5612
Epoch 35/200
266/266 [==============================] - 2s 6ms/step - loss: 1.3775 - accuracy: 0.7211 - val_loss: 1.9075 - val_accuracy: 0.6190
Epoch 36/200
266/266 [==============================] - 2s 7ms/step - loss: 1.3543 - accuracy: 0.7277 - val_loss: 2.0737 - val_accuracy: 0.6117
Epoch 37/200
266/266 [==============================] - 2s 7ms/step - loss: 1.2886 - accuracy: 0.7412 - val_loss: 2.1957 - val_accuracy: 0.5738
Epoch 38/200
266/266 [==============================] - 2s 7ms/step - loss: 1.2925 - accuracy: 0.7413 - val_loss: 2.2705 - val_accuracy: 0.5632
Epoch 39/200
266/266 [==============================] - 2s 7ms/step - loss: 1.2352 - accuracy: 0.7586 - val_loss: 1.8951 - val_accuracy: 0.6203
Epoch 40/200
266/266 [==============================] - 2s 6ms/step - loss: 1.2217 - accuracy: 0.7515 - val_loss: 2.2201 - val_accuracy: 0.5778
Epoch 41/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1940 - accuracy: 0.7637 - val_loss: 2.4241 - val_accuracy: 0.5426
Epoch 42/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1706 - accuracy: 0.7634 - val_loss: 2.3660 - val_accuracy: 0.5532
Epoch 43/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1599 - accuracy: 0.7623 - val_loss: 2.2372 - val_accuracy: 0.5758
Epoch 44/200
266/266 [==============================] - 2s 6ms/step - loss: 1.1402 - accuracy: 0.7707 - val_loss: 1.7895 - val_accuracy: 0.6509
Epoch 45/200
266/266 [==============================] - 2s 7ms/step - loss: 1.1011 - accuracy: 0.7837 - val_loss: 1.9187 - val_accuracy: 0.6283
Epoch 46/200
266/266 [==============================] - 2s 7ms/step - loss: 1.0757 - accuracy: 0.7832 - val_loss: 1.6998 - val_accuracy: 0.6489
Epoch 47/200
266/266 [==============================] - 2s 7ms/step - loss: 1.0524 - accuracy: 0.7961 - val_loss: 1.9654 - val_accuracy: 0.6310
Epoch 48/200
266/266 [==============================] - 2s 6ms/step - loss: 1.0608 - accuracy: 0.7857 - val_loss: 1.8230 - val_accuracy: 0.6509
Epoch 49/200
266/266 [==============================] - 2s 7ms/step - loss: 1.0475 - accuracy: 0.7914 - val_loss: 1.8269 - val_accuracy: 0.6489
Epoch 50/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9945 - accuracy: 0.7981 - val_loss: 2.1262 - val_accuracy: 0.6017
Epoch 51/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9703 - accuracy: 0.8054 - val_loss: 1.8112 - val_accuracy: 0.6589
Epoch 52/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9662 - accuracy: 0.8032 - val_loss: 1.6497 - val_accuracy: 0.6662
Epoch 53/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9319 - accuracy: 0.8115 - val_loss: 2.0003 - val_accuracy: 0.6184
Epoch 54/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9452 - accuracy: 0.8138 - val_loss: 1.5689 - val_accuracy: 0.6789
Epoch 55/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9104 - accuracy: 0.8249 - val_loss: 1.7393 - val_accuracy: 0.6536
Epoch 56/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9319 - accuracy: 0.8087 - val_loss: 1.7058 - val_accuracy: 0.6762
Epoch 57/200
266/266 [==============================] - 2s 7ms/step - loss: 0.9038 - accuracy: 0.8127 - val_loss: 1.8639 - val_accuracy: 0.6277
Epoch 58/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8783 - accuracy: 0.8261 - val_loss: 1.9366 - val_accuracy: 0.6336
Epoch 59/200
266/266 [==============================] - 2s 7ms/step - loss: 0.8832 - accuracy: 0.8243 - val_loss: 1.6392 - val_accuracy: 0.6702
Epoch 60/200
266/266 [==============================] - 2s 7ms/step - loss: 0.8724 - accuracy: 0.8263 - val_loss: 1.7712 - val_accuracy: 0.6636
Epoch 61/200
266/266 [==============================] - 2s 7ms/step - loss: 0.8551 - accuracy: 0.8344 - val_loss: 1.9064 - val_accuracy: 0.6396
Epoch 62/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8565 - accuracy: 0.8312 - val_loss: 1.9119 - val_accuracy: 0.6363
Epoch 63/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8322 - accuracy: 0.8336 - val_loss: 1.7866 - val_accuracy: 0.6496
Epoch 64/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8263 - accuracy: 0.8390 - val_loss: 1.8384 - val_accuracy: 0.6396
Epoch 65/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8311 - accuracy: 0.8379 - val_loss: 1.8872 - val_accuracy: 0.6596
Epoch 66/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8074 - accuracy: 0.8475 - val_loss: 1.7271 - val_accuracy: 0.6676
Epoch 67/200
266/266 [==============================] - 2s 7ms/step - loss: 0.8218 - accuracy: 0.8339 - val_loss: 1.9482 - val_accuracy: 0.6556
Epoch 68/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7812 - accuracy: 0.8471 - val_loss: 1.8246 - val_accuracy: 0.6576
Epoch 69/200
266/266 [==============================] - 2s 6ms/step - loss: 0.8028 - accuracy: 0.8407 - val_loss: 1.7622 - val_accuracy: 0.6749
Epoch 70/200
266/266 [==============================] - 2s 7ms/step - loss: 0.7685 - accuracy: 0.8514 - val_loss: 1.7481 - val_accuracy: 0.6662
Epoch 71/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7800 - accuracy: 0.8544 - val_loss: 1.6035 - val_accuracy: 0.6835
Epoch 72/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7533 - accuracy: 0.8601 - val_loss: 1.6775 - val_accuracy: 0.6842
Epoch 73/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7860 - accuracy: 0.8494 - val_loss: 1.7894 - val_accuracy: 0.6722
Epoch 74/200
266/266 [==============================] - 2s 6ms/step - loss: 0.7520 - accuracy: 0.8551 - val_loss: 1.6762 - val_accuracy: 0.6902
In [ ]:
_, accuracy = model_report(CNN2_MODEL_OPTIMIZED, CNN2_MODEL_OPTIMIZED_history)
accuracies_opt_RMSprop["CNN2"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     1.590
Accuracy: 67.510%
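The CNN2 architecture printed in the summary above (four Conv-BN-ReLU blocks with 32/64/128/256 filters, the first three followed by 2x2 max pooling, then a 512-unit dense layer) can be sketched as below. The dropout rates and the dense activation are assumptions; the layer shapes and parameter count match the summary exactly:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def init_cnn2_sketch(num_classes=20):
    # Sketch of the CNN2 architecture inferred from the printed model summary.
    # Dropout rates and the hidden-layer activation are assumptions.
    model = models.Sequential()
    model.add(tf.keras.Input(shape=(32, 32, 3)))
    for filters, pool in [(32, True), (64, True), (128, True), (256, False)]:
        model.add(layers.Conv2D(filters, (3, 3), padding="same"))
        model.add(layers.BatchNormalization())
        model.add(layers.ReLU())
        if pool:
            model.add(layers.MaxPooling2D((2, 2)))
        model.add(layers.Dropout(0.25))  # assumed rate
    model.add(layers.Flatten())                          # (4, 4, 256) -> 4096
    model.add(layers.Dense(512, activation="relu"))      # activation assumed
    model.add(layers.Dropout(0.5))                       # assumed rate
    model.add(layers.Dense(num_classes, activation="softmax"))
    return model
```

Building this model reproduces the 2,498,260 total parameters reported in the summary.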

Transfer learning

VGG16
In [ ]:
VGG16_MODEL_OPTIMIZED = init_VGG16_model_optimized(True, optimizer=tf.optimizers.RMSprop)
VGG16_MODEL_OPTIMIZED_history = train_model(VGG16_MODEL_OPTIMIZED, epochs=200, callbacks=[callback])
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Functional)           (None, 1, 1, 512)         14714688  
_________________________________________________________________
dropout_14 (Dropout)         (None, 1, 1, 512)         0         
_________________________________________________________________
global_average_pooling2d_2 ( (None, 512)               0         
_________________________________________________________________
dense_8 (Dense)              (None, 20)                10260     
=================================================================
Total params: 14,724,948
Trainable params: 14,724,948
Non-trainable params: 0
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 10s 34ms/step - loss: 2.9129 - accuracy: 0.1170 - val_loss: 1.7624 - val_accuracy: 0.4854
Epoch 2/200
266/266 [==============================] - 9s 33ms/step - loss: 1.7845 - accuracy: 0.4757 - val_loss: 1.3929 - val_accuracy: 0.5791
Epoch 3/200
266/266 [==============================] - 9s 34ms/step - loss: 1.2592 - accuracy: 0.6392 - val_loss: 1.0496 - val_accuracy: 0.6988
Epoch 4/200
266/266 [==============================] - 9s 33ms/step - loss: 0.9636 - accuracy: 0.7315 - val_loss: 0.9853 - val_accuracy: 0.7281
Epoch 5/200
266/266 [==============================] - 9s 33ms/step - loss: 0.7685 - accuracy: 0.7882 - val_loss: 0.9164 - val_accuracy: 0.7354
Epoch 6/200
266/266 [==============================] - 9s 33ms/step - loss: 0.5707 - accuracy: 0.8351 - val_loss: 1.1004 - val_accuracy: 0.7161
Epoch 7/200
266/266 [==============================] - 9s 33ms/step - loss: 0.4736 - accuracy: 0.8684 - val_loss: 1.0349 - val_accuracy: 0.7527
Epoch 8/200
266/266 [==============================] - 9s 33ms/step - loss: 0.3564 - accuracy: 0.9000 - val_loss: 1.2445 - val_accuracy: 0.7347
Epoch 9/200
266/266 [==============================] - 9s 33ms/step - loss: 0.3164 - accuracy: 0.9109 - val_loss: 1.2041 - val_accuracy: 0.7447
Epoch 10/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2703 - accuracy: 0.9265 - val_loss: 1.1359 - val_accuracy: 0.7480
Epoch 11/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2054 - accuracy: 0.9463 - val_loss: 1.1721 - val_accuracy: 0.7533
Epoch 12/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2086 - accuracy: 0.9460 - val_loss: 1.6253 - val_accuracy: 0.7447
Epoch 13/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2116 - accuracy: 0.9551 - val_loss: 1.6205 - val_accuracy: 0.7527
Epoch 14/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2108 - accuracy: 0.9530 - val_loss: 1.4668 - val_accuracy: 0.7646
Epoch 15/200
266/266 [==============================] - 9s 33ms/step - loss: 0.1861 - accuracy: 0.9573 - val_loss: 1.4988 - val_accuracy: 0.7633
Epoch 16/200
266/266 [==============================] - 9s 34ms/step - loss: 0.1944 - accuracy: 0.9616 - val_loss: 1.6280 - val_accuracy: 0.7340
Epoch 17/200
266/266 [==============================] - 9s 33ms/step - loss: 0.1551 - accuracy: 0.9691 - val_loss: 2.0848 - val_accuracy: 0.7221
Epoch 18/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2020 - accuracy: 0.9587 - val_loss: 1.8265 - val_accuracy: 0.7473
Epoch 19/200
266/266 [==============================] - 9s 33ms/step - loss: 0.1783 - accuracy: 0.9689 - val_loss: 1.7038 - val_accuracy: 0.7626
Epoch 20/200
266/266 [==============================] - 9s 33ms/step - loss: 0.1618 - accuracy: 0.9689 - val_loss: 1.8748 - val_accuracy: 0.7713
Epoch 21/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2222 - accuracy: 0.9597 - val_loss: 1.6368 - val_accuracy: 0.7673
Epoch 22/200
266/266 [==============================] - 9s 33ms/step - loss: 0.1985 - accuracy: 0.9627 - val_loss: 2.5106 - val_accuracy: 0.7620
Epoch 23/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2393 - accuracy: 0.9548 - val_loss: 3.9269 - val_accuracy: 0.6928
Epoch 24/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2442 - accuracy: 0.9545 - val_loss: 2.7588 - val_accuracy: 0.7726
Epoch 25/200
266/266 [==============================] - 9s 33ms/step - loss: 0.2490 - accuracy: 0.9556 - val_loss: 1.9481 - val_accuracy: 0.7733
In [ ]:
_, accuracy = model_report(VGG16_MODEL_OPTIMIZED, VGG16_MODEL_OPTIMIZED_history)
accuracies_opt_RMSprop["VGG_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.812
Accuracy: 76.637%
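The VGG16 transfer model above (convolutional base applied directly to 32x32 inputs, hence the (1, 1, 512) feature map, followed by dropout, global average pooling, and a 20-way softmax) can be sketched as follows. The dropout rate is an assumption; the summary's parameter totals, with all layers trainable, are reproduced:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def init_vgg16_sketch(num_classes=20, weights="imagenet"):
    # ImageNet-pretrained VGG16 without its classifier head, on 32x32 inputs.
    base = tf.keras.applications.VGG16(
        include_top=False, weights=weights, input_shape=(32, 32, 3))
    base.trainable = True  # the summary lists all 14.7M parameters as trainable
    return models.Sequential([
        base,                              # -> (1, 1, 512)
        layers.Dropout(0.2),               # rate is an assumption
        layers.GlobalAveragePooling2D(),   # -> (512,)
        layers.Dense(num_classes, activation="softmax"),
    ])
```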
MobileNetV2
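The MobileNetV2 setup built below differs from the previous models in that the CIFAR images are first resized to the network's native 224x224 resolution (the `train_ds_res` / `validation_ds_res` datasets), which matches the (7, 7, 1280) feature map in the printed summary. A hedged sketch, with an assumed dropout rate:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def init_mobilenetv2_sketch(num_classes=20, weights="imagenet"):
    # MobileNetV2 without its classifier head, on inputs resized to 224x224.
    base = tf.keras.applications.MobileNetV2(
        include_top=False, weights=weights, input_shape=(224, 224, 3))
    base.trainable = True  # the non-trainable 34,112 params are BN moving statistics
    return models.Sequential([
        base,                              # -> (7, 7, 1280)
        layers.Dropout(0.2),               # rate is an assumption
        layers.GlobalAveragePooling2D(),   # -> (1280,)
        layers.Dense(num_classes, activation="softmax"),
    ])
```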
In [ ]:
MobileNetV2_MODEL_OPTIMIZED = init_MobileNetV2_model_optimized(True, optimizer=tf.optimizers.RMSprop)
MobileNetV2_MODEL_OPTIMIZED_history = train_model(MobileNetV2_MODEL_OPTIMIZED, train_dataset=train_ds_res, validation_dataset=validation_ds_res, epochs=200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet_v2/mobilenet_v2_weights_tf_dim_ordering_tf_kernels_1.0_224_no_top.h5
9412608/9406464 [==============================] - 0s 0us/step
Model: "sequential_6"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
mobilenetv2_1.00_224 (Functi (None, 7, 7, 1280)        2257984   
_________________________________________________________________
dropout_15 (Dropout)         (None, 7, 7, 1280)        0         
_________________________________________________________________
global_average_pooling2d_3 ( (None, 1280)              0         
_________________________________________________________________
dense_9 (Dense)              (None, 20)                25620     
=================================================================
Total params: 2,283,604
Trainable params: 2,249,492
Non-trainable params: 34,112
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 69s 229ms/step - loss: 1.6012 - accuracy: 0.5541 - val_loss: 2.4167 - val_accuracy: 0.4156
Epoch 2/200
266/266 [==============================] - 60s 225ms/step - loss: 0.3286 - accuracy: 0.9013 - val_loss: 2.8649 - val_accuracy: 0.4515
Epoch 3/200
266/266 [==============================] - 60s 226ms/step - loss: 0.1408 - accuracy: 0.9569 - val_loss: 3.1315 - val_accuracy: 0.4262
Epoch 4/200
266/266 [==============================] - 60s 224ms/step - loss: 0.0776 - accuracy: 0.9758 - val_loss: 3.0855 - val_accuracy: 0.4016
Epoch 5/200
266/266 [==============================] - 60s 225ms/step - loss: 0.0457 - accuracy: 0.9878 - val_loss: 4.6597 - val_accuracy: 0.3763
Epoch 6/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0328 - accuracy: 0.9913 - val_loss: 2.6763 - val_accuracy: 0.4814
Epoch 7/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0260 - accuracy: 0.9922 - val_loss: 3.4168 - val_accuracy: 0.4927
Epoch 8/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0194 - accuracy: 0.9941 - val_loss: 3.3080 - val_accuracy: 0.5266
Epoch 9/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0230 - accuracy: 0.9930 - val_loss: 2.9330 - val_accuracy: 0.5718
Epoch 10/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0241 - accuracy: 0.9918 - val_loss: 1.8357 - val_accuracy: 0.6649
Epoch 11/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0170 - accuracy: 0.9951 - val_loss: 2.0943 - val_accuracy: 0.6682
Epoch 12/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0156 - accuracy: 0.9949 - val_loss: 1.4283 - val_accuracy: 0.7427
Epoch 13/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0126 - accuracy: 0.9967 - val_loss: 1.2326 - val_accuracy: 0.7779
Epoch 14/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0114 - accuracy: 0.9968 - val_loss: 1.1581 - val_accuracy: 0.7912
Epoch 15/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0111 - accuracy: 0.9962 - val_loss: 0.9347 - val_accuracy: 0.8364
Epoch 16/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0147 - accuracy: 0.9956 - val_loss: 1.0613 - val_accuracy: 0.8138
Epoch 17/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0106 - accuracy: 0.9970 - val_loss: 0.8252 - val_accuracy: 0.8551
Epoch 18/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0143 - accuracy: 0.9957 - val_loss: 0.9313 - val_accuracy: 0.8398
Epoch 19/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0064 - accuracy: 0.9977 - val_loss: 0.7336 - val_accuracy: 0.8637
Epoch 20/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0114 - accuracy: 0.9964 - val_loss: 0.9166 - val_accuracy: 0.8457
Epoch 21/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0084 - accuracy: 0.9971 - val_loss: 0.8648 - val_accuracy: 0.8590
Epoch 22/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0084 - accuracy: 0.9969 - val_loss: 0.9124 - val_accuracy: 0.8624
Epoch 23/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0091 - accuracy: 0.9964 - val_loss: 1.0188 - val_accuracy: 0.8597
Epoch 24/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0083 - accuracy: 0.9970 - val_loss: 0.7978 - val_accuracy: 0.8730
Epoch 25/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0103 - accuracy: 0.9970 - val_loss: 0.8110 - val_accuracy: 0.8657
Epoch 26/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0051 - accuracy: 0.9986 - val_loss: 0.7402 - val_accuracy: 0.8777
Epoch 27/200
266/266 [==============================] - 61s 229ms/step - loss: 0.0069 - accuracy: 0.9978 - val_loss: 1.0002 - val_accuracy: 0.8464
Epoch 28/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0111 - accuracy: 0.9969 - val_loss: 0.9896 - val_accuracy: 0.8391
Epoch 29/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0065 - accuracy: 0.9979 - val_loss: 0.9619 - val_accuracy: 0.8511
Epoch 30/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0100 - accuracy: 0.9969 - val_loss: 0.9199 - val_accuracy: 0.8677
Epoch 31/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0097 - accuracy: 0.9966 - val_loss: 1.0055 - val_accuracy: 0.8497
Epoch 32/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0066 - accuracy: 0.9969 - val_loss: 0.8823 - val_accuracy: 0.8531
Epoch 33/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0100 - accuracy: 0.9965 - val_loss: 0.7183 - val_accuracy: 0.8644
Epoch 34/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0062 - accuracy: 0.9983 - val_loss: 0.7421 - val_accuracy: 0.8783
Epoch 35/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0033 - accuracy: 0.9993 - val_loss: 0.8272 - val_accuracy: 0.8850
Epoch 36/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0072 - accuracy: 0.9981 - val_loss: 1.0733 - val_accuracy: 0.8684
Epoch 37/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0114 - accuracy: 0.9962 - val_loss: 1.1948 - val_accuracy: 0.8517
Epoch 38/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0052 - accuracy: 0.9980 - val_loss: 1.0910 - val_accuracy: 0.8517
Epoch 39/200
266/266 [==============================] - 61s 228ms/step - loss: 0.0058 - accuracy: 0.9985 - val_loss: 1.0236 - val_accuracy: 0.8577
Epoch 40/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0074 - accuracy: 0.9983 - val_loss: 0.9929 - val_accuracy: 0.8684
Epoch 41/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0074 - accuracy: 0.9976 - val_loss: 0.9829 - val_accuracy: 0.8597
Epoch 42/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0064 - accuracy: 0.9979 - val_loss: 0.9873 - val_accuracy: 0.8570
Epoch 43/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0079 - accuracy: 0.9974 - val_loss: 1.0826 - val_accuracy: 0.8444
Epoch 44/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0075 - accuracy: 0.9981 - val_loss: 1.1868 - val_accuracy: 0.8497
Epoch 45/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0041 - accuracy: 0.9986 - val_loss: 1.1081 - val_accuracy: 0.8590
Epoch 46/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0047 - accuracy: 0.9985 - val_loss: 1.1534 - val_accuracy: 0.8484
Epoch 47/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0069 - accuracy: 0.9976 - val_loss: 1.0213 - val_accuracy: 0.8677
Epoch 48/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0072 - accuracy: 0.9979 - val_loss: 1.0831 - val_accuracy: 0.8484
Epoch 49/200
266/266 [==============================] - 60s 226ms/step - loss: 0.0073 - accuracy: 0.9986 - val_loss: 1.0931 - val_accuracy: 0.8644
Epoch 50/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0054 - accuracy: 0.9982 - val_loss: 0.9343 - val_accuracy: 0.8743
Epoch 51/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0082 - accuracy: 0.9979 - val_loss: 0.9329 - val_accuracy: 0.8664
Epoch 52/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0052 - accuracy: 0.9984 - val_loss: 1.0278 - val_accuracy: 0.8590
Epoch 53/200
266/266 [==============================] - 60s 227ms/step - loss: 0.0062 - accuracy: 0.9977 - val_loss: 0.9590 - val_accuracy: 0.8743
In [ ]:
_, accuracy = model_report(MobileNetV2_MODEL_OPTIMIZED, MobileNetV2_MODEL_OPTIMIZED_history, test_ds_res)
accuracies_opt_RMSprop["MOBILENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.712
Accuracy: 87.798%
DenseNet
In [ ]:
DENSENET_MODEL_OPTIMIZED = init_DENSENET_model_optimized(True, optimizer = tf.optimizers.RMSprop)
DENSENET_MODEL_OPTIMIZED_history = train_model(DENSENET_MODEL_OPTIMIZED, epochs = 200, callbacks=[callback])
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/densenet/densenet121_weights_tf_dim_ordering_tf_kernels_notop.h5
29089792/29084464 [==============================] - 0s 0us/step
Model: "sequential_7"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Functional)     (None, 1, 1, 1024)        7037504   
_________________________________________________________________
dropout_16 (Dropout)         (None, 1, 1, 1024)        0         
_________________________________________________________________
global_average_pooling2d_4 ( (None, 1024)              0         
_________________________________________________________________
dense_10 (Dense)             (None, 20)                20500     
=================================================================
Total params: 7,058,004
Trainable params: 6,974,356
Non-trainable params: 83,648
_________________________________________________________________
Epoch 1/200
266/266 [==============================] - 33s 58ms/step - loss: 3.5491 - accuracy: 0.1582 - val_loss: 1.7876 - val_accuracy: 0.5286
Epoch 2/200
266/266 [==============================] - 14s 53ms/step - loss: 1.8021 - accuracy: 0.4916 - val_loss: 1.2096 - val_accuracy: 0.6676
Epoch 3/200
266/266 [==============================] - 15s 56ms/step - loss: 1.2675 - accuracy: 0.6309 - val_loss: 1.0795 - val_accuracy: 0.6961
Epoch 4/200
266/266 [==============================] - 14s 54ms/step - loss: 0.9484 - accuracy: 0.7215 - val_loss: 1.1439 - val_accuracy: 0.7074
Epoch 5/200
266/266 [==============================] - 14s 51ms/step - loss: 0.7423 - accuracy: 0.7708 - val_loss: 0.9565 - val_accuracy: 0.7447
Epoch 6/200
266/266 [==============================] - 14s 52ms/step - loss: 0.5980 - accuracy: 0.8155 - val_loss: 0.9287 - val_accuracy: 0.7633
Epoch 7/200
266/266 [==============================] - 16s 60ms/step - loss: 0.5090 - accuracy: 0.8421 - val_loss: 0.9048 - val_accuracy: 0.7739
Epoch 8/200
266/266 [==============================] - 15s 56ms/step - loss: 0.3606 - accuracy: 0.8913 - val_loss: 0.9092 - val_accuracy: 0.7879
Epoch 9/200
266/266 [==============================] - 12s 46ms/step - loss: 0.3068 - accuracy: 0.9083 - val_loss: 0.9986 - val_accuracy: 0.7693
Epoch 10/200
266/266 [==============================] - 14s 52ms/step - loss: 0.2376 - accuracy: 0.9282 - val_loss: 1.0623 - val_accuracy: 0.7832
Epoch 11/200
266/266 [==============================] - 13s 49ms/step - loss: 0.2135 - accuracy: 0.9347 - val_loss: 1.1108 - val_accuracy: 0.7653
Epoch 12/200
266/266 [==============================] - 14s 52ms/step - loss: 0.1844 - accuracy: 0.9431 - val_loss: 1.0593 - val_accuracy: 0.7859
Epoch 13/200
266/266 [==============================] - 14s 54ms/step - loss: 0.1711 - accuracy: 0.9493 - val_loss: 1.0627 - val_accuracy: 0.7839
Epoch 14/200
266/266 [==============================] - 13s 50ms/step - loss: 0.1246 - accuracy: 0.9595 - val_loss: 1.1470 - val_accuracy: 0.7819
Epoch 15/200
266/266 [==============================] - 14s 53ms/step - loss: 0.1268 - accuracy: 0.9614 - val_loss: 1.1861 - val_accuracy: 0.7766
Epoch 16/200
266/266 [==============================] - 14s 51ms/step - loss: 0.1313 - accuracy: 0.9582 - val_loss: 1.1803 - val_accuracy: 0.7759
Epoch 17/200
266/266 [==============================] - 14s 53ms/step - loss: 0.1057 - accuracy: 0.9671 - val_loss: 1.1670 - val_accuracy: 0.7919
Epoch 18/200
266/266 [==============================] - 14s 51ms/step - loss: 0.0985 - accuracy: 0.9667 - val_loss: 1.4473 - val_accuracy: 0.7633
Epoch 19/200
266/266 [==============================] - 15s 56ms/step - loss: 0.0875 - accuracy: 0.9729 - val_loss: 1.2239 - val_accuracy: 0.7653
Epoch 20/200
266/266 [==============================] - 16s 59ms/step - loss: 0.0958 - accuracy: 0.9730 - val_loss: 1.2981 - val_accuracy: 0.7766
Epoch 21/200
266/266 [==============================] - 13s 48ms/step - loss: 0.0915 - accuracy: 0.9703 - val_loss: 1.2597 - val_accuracy: 0.7713
Epoch 22/200
266/266 [==============================] - 12s 46ms/step - loss: 0.0632 - accuracy: 0.9820 - val_loss: 1.3939 - val_accuracy: 0.7699
Epoch 23/200
266/266 [==============================] - 15s 58ms/step - loss: 0.0770 - accuracy: 0.9743 - val_loss: 1.1635 - val_accuracy: 0.7793
Epoch 24/200
266/266 [==============================] - 15s 58ms/step - loss: 0.0742 - accuracy: 0.9766 - val_loss: 1.3516 - val_accuracy: 0.7819
Epoch 25/200
266/266 [==============================] - 14s 54ms/step - loss: 0.0752 - accuracy: 0.9787 - val_loss: 1.4104 - val_accuracy: 0.7819
Epoch 26/200
266/266 [==============================] - 14s 54ms/step - loss: 0.0595 - accuracy: 0.9804 - val_loss: 1.2955 - val_accuracy: 0.7779
Epoch 27/200
266/266 [==============================] - 14s 52ms/step - loss: 0.0567 - accuracy: 0.9815 - val_loss: 1.3873 - val_accuracy: 0.7653
In [ ]:
_, accuracy = model_report(DENSENET_MODEL_OPTIMIZED, DENSENET_MODEL_OPTIMIZED_history)
accuracies_opt_RMSprop["DENSENET_ALL"] = accuracy
Test set evaluation metrics
---------------------------
Loss:     0.979
Accuracy: 76.141%
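Both transfer runs halt well before the 200-epoch budget (MobileNetV2 at epoch 53, DenseNet at epoch 27), which is consistent with `callback` being a patience-based early-stopping callback monitoring `val_loss`: DenseNet's best val_loss appears at epoch 7 and training stops 20 epochs later, and MobileNetV2's best appears at epoch 33 with the stop at epoch 53. A minimal sketch of that patience logic (the notebook's actual `callback` configuration is not shown in this section, so the patience value of 20 is an inference from the logs):

```python
def early_stop_epoch(val_losses, patience=20):
    """Return the 1-based epoch at which patience-based early stopping halts.

    Stops after `patience` consecutive epochs without a new best val_loss;
    if the budget is never exhausted, returns the last epoch.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0   # new best: reset the patience counter
        else:
            wait += 1
            if wait >= patience:
                return epoch       # patience exhausted: stop here
    return len(val_losses)
```

In Keras this corresponds to `tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=20)`; combining it with `restore_best_weights=True` would additionally roll the model back to the best epoch.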

Comparison bar plots

In [ ]:
# set width of bar
barWidth = 0.15
model_names = ['Simple Model', 'CNN1', 'CNN2', 'VGG16', 'MobileNet', 'DenseNet']
model_keys = ["SIMPLE_MODEL", "CNN1", "CNN2", "VGG_ALL", "MOBILENET_ALL", "DENSENET_ALL"]

# set height of bars: one list of accuracies per optimizer
bar1 = [accuracies_opt[k] for k in model_keys]          # Adam
bar2 = [accuracies_opt_Nadam[k] for k in model_keys]    # Nadam
bar3 = [accuracies_opt_SGD[k] for k in model_keys]      # SGD
bar4 = [accuracies_opt_RMSprop[k] for k in model_keys]  # RMSprop

# Set position of bar on X axis
r1 = np.arange(6)
r2 = [x + barWidth for x in r1]
r3 = [x + barWidth for x in r2]
r4 = [x + barWidth for x in r3]


plt.figure(figsize=(12,4))
plt.bar(r1, bar1, color='#003f5c', width=barWidth, edgecolor='white', label = 'Adam')
plt.bar(r2, bar2, color='#ffa600', width=barWidth, edgecolor='white', label = 'Nadam')
plt.bar(r3, bar3, color='#bc5090', width=barWidth, edgecolor='white', label = 'SGD')
plt.bar(r4, bar4, color='#25A640', width=barWidth, edgecolor='white', label = 'RMSprop')
plt.xticks([r + barWidth for r in range(6)], model_names)
plt.ylim(bottom=0.1)
plt.legend(loc='best')
plt.title("Experiments on Optimizer")
plt.ylabel("Classification Accuracy")
plt.grid(axis="y", linestyle="--")
plt.show()

We observe that the Adam, Nadam, and RMSprop optimizers achieve very similar performance across all models. By contrast, SGD clearly underperforms on the from-scratch networks: it converges much more slowly than the other three, so it would need a substantially larger number of epochs (more than 200) to approach their accuracy. On the transfer learning models, all four optimizers behave comparably in terms of test accuracy.
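The gap between plain SGD and the adaptive optimizers is easiest to see on a toy objective with a poorly scaled gradient, where Adam-style methods normalize the effective step size while SGD's progress is proportional to the raw gradient. The sketch below is illustrative only (pure stdlib, not the notebook's training code) and implements the standard Adam update rule by hand; with the same learning rate, SGD barely moves on this objective while Adam reaches the minimum:

```python
import math

def grad(w):
    # gradient of the badly scaled quadratic f(w) = 0.001 * w**2
    return 0.002 * w

def run_sgd(steps=200, lr=0.1, w=5.0):
    """Plain gradient descent: step is proportional to the (tiny) gradient."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def run_adam(steps=200, lr=0.1, w=5.0, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moments normalize the step size."""
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g          # first-moment (momentum) estimate
        v = b2 * v + (1 - b2) * g * g      # second-moment estimate
        m_hat = m / (1 - b1 ** t)          # bias corrections
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w
```

Starting from w = 5, SGD stays above 4 after 200 steps (each step shrinks w by a factor of only 0.9998), while Adam ends much closer to the optimum at 0. This is a simplified analogue of the observation above: on the from-scratch networks, untuned SGD would need far more epochs to match the adaptive optimizers.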